July 2011 Archives

I wish someone would "integrate" my data!

bridgwatera

If you happen to know me personally (or you're a 'web-friend' perhaps), then you probably know that I am moving house. It's a process that I like to sum up with the following alliterative memo: moving, migrating, momentum and a maelstrom of madness.

During the process of moving there are meetings to be scheduled, forms to fill out, boxes to be ticked, plans for "migration day" to be drawn up, support services (we'll call them removal men!) to be booked for the big day, costs to be shouldered, addresses to be updated and so on.

... and yes you guessed it, I could do with some help -- and every time a data integration press release hits my inbox my blood starts to boil.

Why don't we have these kinds of data support services for individuals?

I mean Informatica has this week expanded its existing product portfolio to give customers more options to meet their diverse business continuity, big data and operational data integration needs.

Is this too tenuous a link to house moving? Please stay with me...

[Image: Moving_house.JPG]

Data replication is the process of sharing data in real time to keep redundant database resources consistent, improving reliability, fault tolerance, latency and accessibility.

"A part of the Informatica Platform, Informatica's new data replication technology includes Informatica Fast Clone which automates the cloning of application data and Informatica Data Replication which manages the capture, routing and delivery of high-volume transaction data across diverse systems in real time with minimal source system impact," said the company, in a press statement.

Informatica suggests that use cases include faster, less costly database migrations with zero downtime, mirroring of operational data for continual access during maintenance and upgrade cycles, active data warehousing with more up-to-date information and live transactional auditing with a full audit trail of every transaction event.
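To make the capture-and-apply idea a little more concrete, here is a minimal, purely illustrative Python sketch of log-based replication (the pattern that products such as Informatica Data Replication industrialise): changes are read from a source "log" and applied to a replica so the two stay consistent. The data and field names are invented; real tools read database transaction logs and handle ordering, conflicts and recovery.

```python
# Illustrative sketch of log-based replication (change data capture):
# capture changes from a source transaction log and apply them to a replica.
# Toy model only -- real replication products work against redo/transaction
# logs, not Python lists.

source_log = [
    {"op": "insert", "key": "cust:1", "value": {"name": "Alice", "city": "Leeds"}},
    {"op": "update", "key": "cust:1", "value": {"name": "Alice", "city": "York"}},
    {"op": "delete", "key": "cust:2", "value": None},
]

replica = {"cust:2": {"name": "Bob", "city": "Bath"}}

def apply_change(store, change):
    """Apply a single captured change to the replica store."""
    if change["op"] in ("insert", "update"):
        store[change["key"]] = change["value"]
    elif change["op"] == "delete":
        store.pop(change["key"], None)

for change in source_log:          # the routing/delivery step, greatly simplified
    apply_change(replica, change)

print(replica)   # {'cust:1': {'name': 'Alice', 'city': 'York'}}
```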

The sad fact is that I will soon be experiencing some downtime of my own and I could certainly do with some data mirroring as I switch all my personal details to my new property.

I'll probably get a sore back while humping boxes too -- ah, at least Informatica doesn't claim to solve that problem, yet!

SAP & Google: mapping the "where" of information

bridgwatera

SAP has collaborated with Google to help customers manage large data volumes. The concept uses visual displays to "leverage" (ouch!) SAP business analytics software with location-based data capabilities, thus allowing users to interact with real-time information via Google Maps.

So this is basically: enterprise applications meets Google Maps and Google Earth.

Using location-based intelligence capabilities, SAP envisions "bringing corporate information to life" via Google's interactive maps, satellite images and even street-level views.

"Customers can analyse their businesses in a geospatial context," says SAP.

Confused by that statement? I'm not surprised, so what does it mean?

SAP says it wants businesses to understand the "where" of their information, as well as global, regional and local trends, and to see how all these elements are impacted by different scenarios.

What the company is basically saying is that corporate data is increasingly meta tagged with geospatial information. So the best way of looking at that data to get the "where" element is via a map -- and it might as well be a nice digital (let's not forget intuitive) Google Map.

As developers and DataBase Administrators (DBAs) face new challenges with managing so-called "big data" volumes, this could prove to be a helpful tool in terms of data management.

SAP provides the following example scenarios for how this product could work:

● A telecom operator could use Google Earth and SAP BusinessObjects Explorer software to perform dropped-call analysis and pinpoint the geo-coordinates of faulty towers.

● A mortgage bank could perform risk assessment of its mortgage portfolio by overlaying foreclosure and default data with the location of loans on Google Maps.
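Taking the first scenario as a rough illustration: once call records carry geo-coordinates, even a few lines of code can surface the "where". The sketch below uses hypothetical data and field names (nothing SAP- or Google-specific); it simply groups dropped calls by tower location to pinpoint the worst offender, which is the analysis the SAP/Google pairing then layers onto interactive maps.

```python
from collections import Counter

# Hypothetical dropped-call records, each tagged with the tower's coordinates.
dropped_calls = [
    {"tower": "TWR-014", "lat": 51.5074, "lon": -0.1278},
    {"tower": "TWR-014", "lat": 51.5074, "lon": -0.1278},
    {"tower": "TWR-231", "lat": 53.4808, "lon": -2.2426},
    {"tower": "TWR-014", "lat": 51.5074, "lon": -0.1278},
]

# Group by tower to find the geo-coordinates with the most dropped calls.
counts = Counter(call["tower"] for call in dropped_calls)
worst_tower, worst_count = counts.most_common(1)[0]

coords = next((c["lat"], c["lon"]) for c in dropped_calls if c["tower"] == worst_tower)
print(f"{worst_tower} at {coords} dropped {worst_count} calls")
# TWR-014 at (51.5074, -0.1278) dropped 3 calls
```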

"Today, more and more information is being geo-tagged, and it is unlocking an entirely new dimension for enterprise data," said Sanjay Poonen, president, global solutions, SAP AG. "We aim to provide our customers the opportunity to tap into the power of business analytics combined with location intelligence through a geographic view and use rich, interactive analytics to respond to events as they unfold in real time."

The video below explains this technology nicely -- it's a two-minute view.

The 'dynamic' data centre is also the 'democratic' data centre

bridgwatera

Application Delivery Networking company F5 Networks this week takes the wraps off its BIG-IP version 11 software. This release is part of what F5 envisions as a dynamic data centre where services are provisioned on an application-by-application basis for better visibility and reporting of IT resource utilisation.

So what does that mean then?

A dynamic data centre has a "clear view" of users, applications and the network itself. This (in theory) enables IT departments to deploy application services on a per-application basis as rapidly as they can provision virtual machines to host the apps.

But what do they mean by "clear view"?

Being able to see users and apps on a clear path means that services such as user authentication, data protection, traffic management and application acceleration can all be brought to bear. If you like, the dynamic data centre is also the democratic data centre where freedom of speech and individual freedom reign openly.

[Image: F5 iApp Ecosystem - DevCentral.png]

"As applications are virtualised and deployed in the cloud, the network becomes an integral part of the application delivery infrastructure and ultimately defines user productivity," said Karl Triebes, CTO and SVP of Product Development at F5.

"With the rise of virtualisation, an organisation can go from a bare metal server to a running application in just minutes. However, it might take days if not weeks to provision the underlying network and application delivery services to make that application production-ready and available to users.

"With BIG-IP v11, F5 changes the traditional network paradigm and makes it as simple to provision the Application Delivery Network as it is to create new virtual machines. BIG-IP v11's application control plane architecture aligns application control, visibility, and manageability and allows additional services to be added on demand as current conditions require," he added.

So now you've got the basics, what does this mean in practice?

F5 says that with BIG-IP v11, organisations can deploy applications up to 100x faster, enabling them to quickly react to changes in business needs.

100x faster, really?

That's what F5 says -- but it's a pretty arbitrary figure to be honest; probably plucked from the air, given that spinning up a virtual cloud app can be done in less than a minute.

Regardless, the company further states that by managing application services as a whole rather than as individual devices and objects, IT teams can streamline operations and dramatically lower OpEx. BIG-IP v11 enables application-specific configurations to move with applications as new instances are created locally, virtually, or in the cloud. Additionally, F5 solutions now provide application-specific intelligence for a variety of popular applications from vendors such as Oracle, Microsoft, SAP, and others.

BIG-IP v11 gives organizations the power to configure, monitor, grow, and secure their business applications independently to gain greater efficiency, increase agility, and reduce CapEx and OpEx.

OpEx?

Operating Expenses silly, do try and keep up.

Finally (although this 500 word summary is only about a quarter of the company's 2000 word press release), BIG-IP v11's powerful new iApps (trademark deleted) technology is said to simplify deployment of application services with easy-to-use templates to associate specific sets of services with single, per-application policies that are portable -- even as the application itself moves beyond the walls of the physical data centre.
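The per-application template idea is easier to picture with a toy example. The sketch below is emphatically not iApps syntax (F5 has its own templating system); it is just a generic Python illustration of the concept: bundle the delivery services an application needs into one named, portable policy, and apply that whole bundle wherever a new instance of the application is provisioned.

```python
# Conceptual illustration only -- not F5 iApps syntax. One policy object
# bundles the delivery services an application needs; provisioning a new
# instance (locally, virtually or in the cloud) applies the bundle at once.

app_policy = {
    "app": "web-storefront",
    "services": {
        "load_balancing": {"method": "least-connections"},
        "ssl_offload": {"enabled": True},
        "acceleration": {"compression": True, "caching": True},
        "auth": {"scheme": "single-sign-on"},
    },
}

def provision_instance(instance_name, policy):
    """Pretend to attach every service in the policy to a new app instance."""
    for service, settings in policy["services"].items():
        print(f"{instance_name}: enabling {service} with {settings}")

# The same portable policy follows the app into a second data centre or cloud.
provision_instance("storefront-dc1-vm07", app_policy)
provision_instance("storefront-cloud-eu-01", app_policy)
```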

Cheap sarcastic journalist quips aside, this is clearly a major release for F5 and the company has put considerable weight behind this news announcement. Interesting pickings for sure.

Selling the Sybase + SAP syncopated solution

bridgwatera

I recently spent some time examining the market-facing side of Sybase and SAP's combined solutions from a customer perspective for an International Sybase User Group analysis feature.

My argument was that the market needs Sybase whether or not it is the largest database player in terms of total sales. Oracle, IBM and Microsoft (if anything) are a little less agile and potentially weighed down by their colossal corporate girth -- even if they do rack up more unit sales.

I think I'm pretty sure about that statement, although I do question whether I was just keen to try and use the expression "colossal corporate girth" ☺

[Image: sybase.jpg]

The question now arises of whether we are about to witness a bit of a 'three in a bed is a crowd' scenario, with Sybase, SAP and Oracle all coming into the mix.

Estimates suggest that over three-quarters of SAP's ERP customers use Oracle database technology at the back end. This is not great for SAP (obviously) as the two firms are pitched very much head-to-head when it comes to analytical applications. But with Sybase now in the arsenal, SAP arguably has a route to chip into its customers' usage of Oracle technologies.

Indeed, SAP appears to be moving with all speed to approve formal certification of ASE as the RDBMS (Relational DataBase Management System) of choice for its own ERP offerings. In line with this move, SAP will also want to endorse Sybase IQ as its favoured RDBMS for the data warehouse behind SAP BusinessObjects analytics.

This week then, we hear news of Sybase opening up mobile solutions from its current portfolio to partners within the global SAP ecosystem. This marks the first time (as far as I am aware) that SAP value-added resellers (VARs) currently authorised to sell SAP BusinessObjects and SAP Business All-in-One solutions will get the opportunity to sell select Sybase-branded enterprise mobility apps and solutions for application development, device management and security.

The release (arguably) broadens the availability of combined SAP and Sybase solutions within the market and delivers new mobile and revenue opportunities to more than 3,000 potential SAP partners.

"Making Sybase solutions available through our existing reselling partners around the world allows us to reach a wider array of customers, both large and small, and allows customers to team up with the best partners to meet their distinct mobility needs," said Friedrich Neumeyer, senior vice president, volume reseller and service partners, global ecosystem and channels, SAP.

"Partners will benefit from new mobility offerings that they can monetize and use to attract new customers. Together, we can more easily offer the type of industry- and role-specific mobility solutions that meet market demands, as the ever-increasing influx of more powerful smartphones, tablets and applications capture the public's imagination," added Neumeyer.

The following Sybase solutions are available through the SAP ecosystem today:

• Afaria: a mobile device management and security solution
• Sybase Unwired Platform: a mobile enterprise application platform
• Sybase Mobile Sales application for SAP CRM: access to the SAP Customer Relationship Management (SAP CRM) applications
• Sybase Mobile Workflow application for SAP Business Suite: a solution for mobile workers to complete business processes

So the ice has been broken, the cherry has been popped, SAP and Sybase are out in public -- there, they've said it - does that feel better?

My analysis piece (yes, journalists can write like analysts too!) ran with the headline "Sybase & SAP: Database Darlings, or Shotgun Wedding...?" -- drop me a line and I'll send it to you if you like.

Third-party software is a manageable threat

bridgwatera

This is a guest post for the Computer Weekly Developer Network by Rutul Dave, senior software developer with Coverity, and Chris Adlard, the company's EMEA marketing director.

In a recent webcast, Forrester Research (alongside Coverity) presented the topic: "Is Untested Third-Party Code Threatening Your Business?" It was clear from the questions and comments that this issue is a growing concern -- and businesses are looking to gain better visibility into software risks across development organisations.

In one of the polling questions, 58% of the attendees indicated that they had experienced a negative impact on their business due to third-party code supplied from outsourcing partners, open source and offshore teams. Considering this result, it wasn't surprising that many were interested to hear directly from Dr. Chenxi Wang, principal analyst with Forrester Research, about the key findings from the Forrester Consulting Software Integrity Risk Report.

(Note: This report suggests that all organisations use third-party code and 50% use it extensively or regularly. According to the report, less than 50% of the respondents tested third-party code with the same rigour as internally developed code.)

The software integrity research surveyed 336 software development influencers in North America and Europe on current practices and market trends for managing software quality, security and safety.

The findings revealed that, in today's environment, businesses are impacted by software defects.

Dr. Wang explained, "While a development team might be implementing better practices in their internal development process like rigorous code reviews and unit testing, they always fell on the sword when they leveraged third-party code that had not been held to the same testing rigour."

Due to these impacts, development teams were now being held more accountable for customer satisfaction. Unsurprisingly, software integrity is now an essential part of the development organisation's responsibilities. Developers need to ensure their software meets the highest standards for quality, works reliably in safety-critical systems, and is free of exploitable security vulnerabilities.

[Image: Third-party_option_key.JPG]

As the findings were shared, many attendees asked the popular question: "How can developers get ahead of these trends and reduce the risk of using insufficiently tested software?"

The short answer is by using available technology to your advantage. Developers have the least visibility into the quality of third-party code. This is clearly where an automated code testing technique like static analysis would be the quickest and most cost-effective way to gain that visibility. Static analysis can help identify the most critical bugs in C/C++, Java, and C# codebases and provide early warning of software or business risks. It gives developers greater control of the software supply chain -- in other words, visibility of critical issues.
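As a flavour of what "static analysis" means in practice, here is a deliberately tiny Python sketch that inspects source code without running it, using the standard library's ast module to flag calls to eval (a classic security red flag). Commercial tools such as Coverity's work on C/C++, Java and C# and find far deeper defects (null dereferences, resource leaks, concurrency bugs), but the principle -- reasoning about code before it ever executes -- is the same.

```python
import ast

# Source code under review -- note the risky eval() call buried inside.
SOURCE = """
def load_config(text):
    return eval(text)   # executes arbitrary input: a security defect
"""

def find_eval_calls(source):
    """Walk the parsed syntax tree and report any call to eval()."""
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id == "eval":
                findings.append(f"line {node.lineno}: call to eval() on untrusted input")
    return findings

for finding in find_eval_calls(SOURCE):
    print(finding)
# line 3: call to eval() on untrusted input
```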

As the presentations concluded and Q&A began, the majority of the questions raised surrounded the topic of implementation. It was clear that the attendees - 39% of whom openly confessed to not currently using a static analysis solution - were now very ready to take action and to invest in a technology that could make software integrity happen.

On the webcast, developers asked a number of additional technical questions, not all of which we had time to answer in the allocated session. In our conclusion, we suggested that static analysis is an essential part of developer-side software testing and that organisations need to look at code governance solutions (such as Coverity) in order to more effectively manage software code provided by third-party suppliers.

Ed: just one name-checking product plug right at the end? We'll allow that ☺

Will 'citizen developers' be as bad as 'citizen journalists'?

bridgwatera

I have never been a fan of so-called citizen journalists. It's fine if you publish your own blog and share it with your friends and social media contacts, but the distinction between the way a trained reporter will present a story and the opinionated ramblings of a blogger needs to be drawn.

Jolie O'Dell presents a lucid account here: How To Tell A Journalist From A Blogger

Having said all that, many people would no doubt rank a citizen journalist's worth over and above that of certain News of the World journalists right now, so best I tread carefully.

[Image: The_Citizens'_Manual.jpg]

Moving this thought onward to 'citizen developers', Gartner predicts that drag-and-drop non-coding development will make up "at least a quarter of new business applications by 2014" due to shoestring IT budgets and end-user frustrations.

Gartner defines a 'citizen developer' as an end user who creates new business applications for consumption by others using development and runtime environments sanctioned by corporate IT.

Gartner is, thankfully, wary of this development...

"End-user application development (EUAD) is nothing new, but the risks and opportunities it presents have become much greater in recent years," said Ian Finley, research vice president at Gartner. "In the past, EUAD posed limited risks to the organisation because it was typically limited to a single user or workgroup. However, end users can now build departmental, enterprise and even public applications. While this change enables organisations to empower end users and releases IT resources, it also heightens the risks of EUAD."



Seemingly resigned to the inevitability of this new trend, Gartner analysts said that IT organisations need to adapt to the new realities of EUAD and build a citizen developer support programme.



But surely we need to highlight the danger element here even more?

Speaking to Computer Weekly Developer Network exclusively, James Peel, product manager with open source network and application monitoring tool specialist Opsview said, "Undoubtedly, going down the citizen developer route will be an attractive proposition for a number of organisations due to the lower costs."

"However, businesses need to ensure such applications don't have a negative impact on the performance of their IT. Although citizen developers 'may' have good IT skills, there is no guarantee that the applications they develop won't have bugs which may cause IT performance problems further down the line. Furthermore, unlike with traditional enterprise applications, they are less likely to benefit from formal quality assurance or user acceptance testing. There also won't be the external support to help diagnose and troubleshoot problems, which means there will be a drain on existing IT department resources," added Peel.

If we follow Opsview's argument here, organisations looking to embrace applications from citizen developers must have the necessary IT monitoring tools in place in order to identify any potential performance problems before they significantly impact the business.

Deloitte: the killer app could be killed off

bridgwatera

Self-styled 'business advisory' firm Deloitte has said that brands (and the software developers building for them) will find it even more challenging to ensure their mobile apps stand out to consumers in the current economic landscape.

The so-called "killer app" is becoming harder to build and harder to single out as a true market leader.

[Image: Augmented_GeoTravel.jpg]

Deloitte's research has found that less than one percent of apps published by a selection of global consumer and healthcare brands were downloaded more than a million times -- and that only 20% of the apps were downloaded enough times to appear in Deloitte's analysis.

The firm also found:

  • Apps using location information through a portability function (81%) were most likely to be downloaded.
  • 45% of consumers with a smartphone download an app at least once a week.

Deloitte predicts the rapid proliferation of apps has led to increasingly discerning consumers, as app stores become more popular and users become more mature.

Apps that used the following functions were far more likely to be downloaded:

- Portability - 81%
- Accelerometer - 77%
- Sophisticated touch screen use - 61%
- Location-based services - 61%
- Camera - 59%


According to Deloitte media partner Howard Davies, "Location data is important for the evolution of the app market. If data gathered locally could be exchanged with data from the cloud, whether about location, environment and motion or specific to that individual, then targeted advertising could be developed and this would help brands to make money from their apps."

"Consumers need to see the benefits of receiving more personalised advertising on their smartphones, devices that have previously been advert free. They also need to consent to let their personal data to be used in this manner. Likewise, brands need to co-operate with traditional advertising vendors, companies with similar ambitions in other industries or even competitors."

"At the moment the feed of data from handsets to media planners is still inconsistent across operators and platforms and as a result, difficult to use on any scale by advertisers. This will need to be resolved in order for brands to make the most from apps," added Davies.

Deloitte also suggests that developers eyeing a cross-platform deployment option should think carefully; the company estimates that the cost of developing the same application for two platforms is 160% of the cost of developing for one -- in other words, the second platform adds roughly 60% to the bill rather than doubling it.

Although Deloitte appears to have missed the "write once, run anywhere" message so dearly loved by the likes of Nokia's Qt and others, some of the firm's wider findings here may indeed be thought-provoking.

Sybase & Gartner: democracy needed in data warehouse analytics

bridgwatera

As I'll shortly be on the road to Sybase's TechWave developer and database professional symposium in September, my ear is uncommonly well tuned to news emanating from the company's PR portholes (or should that be portals?) just now.

Hot off its press release grill this month is news of the general availability of Sybase IQ 15.3 -- this business intelligence/analytics and data warehouse-focused relational database management product is now powered by a so-called "new generation" of shared-everything massively parallel processing (MPP) technology.

"The transformative power of business analytics cannot be fully realised until IT departments provide all users with real-time access to information assets across the enterprise and enable integration of analytics directly into core operational processes," said Sybase, in a press statement.

If we accept the above statement to be true, then it's a great shame to read Gartner's estimation that more than 70% of enterprise data warehouses actually serve only back-office or limited departmental use cases.

So then, we need to bring more democracy to data warehouse analytics I suppose.

Or if you want the Sybase (PR headline) version:

Sybase Brings 'Intelligence For Everyone' To The Enterprise
With New Parallel Distributed Query And Advanced Workload Management Capabilities

Also in the mix with this news announcement is the use of Sybase IQ PlexQ technology. This offering dynamically balances query workloads across nodes on the grid for massively parallel processing of complex analytics at speed -- or if you want it more plainly: it scales to support real-time access for thousands of users, multiple mixed workloads and massive datasets.
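Strip away the branding and the idea behind parallel distributed query is easy to sketch: fan a query out to many nodes, let each work on its own slice of the data, then merge the partial results. The Python below is a conceptual stand-in (in-process threads and toy data rather than an MPP grid) and does not represent Sybase IQ's actual mechanics.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy data: each "node" owns one shard of a sales table.
shards = {
    "node-1": [{"region": "EMEA", "amount": 120}, {"region": "APAC", "amount": 75}],
    "node-2": [{"region": "EMEA", "amount": 200}, {"region": "AMER", "amount": 90}],
    "node-3": [{"region": "AMER", "amount": 60},  {"region": "APAC", "amount": 30}],
}

def run_on_node(rows):
    """Each node computes a partial aggregate over its own shard."""
    partial = {}
    for row in rows:
        partial[row["region"]] = partial.get(row["region"], 0) + row["amount"]
    return partial

# Fan the query out to all nodes in parallel, then merge the partial results.
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(run_on_node, shards.values()))

totals = {}
for partial in partials:
    for region, amount in partial.items():
        totals[region] = totals.get(region, 0) + amount

print(totals)   # {'EMEA': 320, 'APAC': 105, 'AMER': 150}
```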

[Image: Sybase SAP.png]

"Enterprise IT departments are demanding new ways to enhance their enterprise data warehouse systems to increase intelligence, productivity and analytics capability across the enterprise. With Sybase IQ, enterprises can run complex analytical models and algorithms within the database, eliminating the need for data mining algorithms inside business logic and accelerating overall performance of complex queries," said Brian Vink, vice president, product management, Sybase, an SAP company.

So what lessons for the future here, then?

• In the data-centric world, business analytics will be increasingly integrated into applications.
• Business analytics will also be increasingly integrated into business workflows.
• Democratic real-time access to information by all stakeholders is a cornerstone of business analytics best practice.
• Sybase CEO John Chen will continue to rank analysts as more important than journalists.

Ah well, you can't have everything in this world can you?

Gordon Ramsay, snapLogic data connection automation & an amuse bouche

bridgwatera

I spent yesterday lunchtime at a cloud computing discussion lunch at Gordon Ramsay's Claridge's with a company called snapLogic.

Before you ask: starter was trout with cucumber sorbet (not bad), main was chicken and mash with truffles (completely amazing) and pud was a baked pear tart thingy that I didn't care for -- but I have to say that the amuse bouche selection was to die for.

Honestly, one lunch at Ramsay's and I'm clearly full of it. I digress.

[Image: F word.jpg]

OK, so chicken and mash aside, what's the tech beef here? Well, snapLogic offers cloud connection technologies. Its 'snaps' are described as application- and language-neutral connectors. So this is data connectivity for the cloud, in the cloud - or back on earth if need be.

The problem is (says snapLogic) most companies are not incorporating an 'information consolidation' policy into their wider cloud adoption policy. SaaS renders previous integration techniques obsolete, says the company. Hand-coded integration scripts do not work in the cloud -- and this (in the main) is due to the fact that SaaS vendors do not provide access to their underlying databases.
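Because the database behind a SaaS application is off-limits, connectors pull data through the vendor's published web API instead. The sketch below is a generic illustration of that shape using only the Python standard library; the URL, token and field names are made up, and real 'snaps' add scheduling, mapping and error handling on top.

```python
import json
import urllib.request

# Hypothetical SaaS endpoint and credentials -- no direct database access,
# just the vendor's published REST API.
API_URL = "https://api.example-saas.com/v1/contacts?updated_since=2011-07-01"
API_TOKEN = "replace-with-a-real-token"

def fetch_contacts(url, token):
    """Pull records from the SaaS API and return them as Python dicts."""
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))

def load_into_warehouse(records):
    """Stand-in for the 'connection' step that lands data in a local store."""
    for record in records:
        print("upserting", record.get("id"), record.get("email"))

# load_into_warehouse(fetch_contacts(API_URL, API_TOKEN))  # needs a real endpoint
```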

It was my pleasure to share the discussion forum yesterday with esteemed analyst and blogger Phil Wainewright, who provided a white paper on the subject of cloud integration (sorry! - connection) challenges.

Here's a selection of talking points drawn from an analysis of Phil's white paper which is entitled the "Cloud Connection Imperative"...

• Does the need for continuous real time connection in and of itself pose additional challenges for data services integration?
• When we talk about "classic integration architectures" and say that the old ways no longer work - when do you think the penny will drop and the cloud connection revolution will be widely recognised?
• Where is the gap between conventional integration channels and the needs of the cloud?
• Does the "continuously updated" nature of cloud applications make it harder to connect to static on-premise apps?
• You describe the connection imperative as "tactical incidents with a strategic framework" - do you simply mean that "we need a plan" or are you alluding to something more complex here?
• Mobile users add an additional stream to the connection challenge right? How do we start to approach these challenges?
• You talk about a "single thread of recurring capabilities" across a connection infrastructure... at what point can we start to identify and group commonalities inside this process?

So these are some of the talking points that developers might want to consider if they are thinking about automation technologies for connecting cloud data.

Onward from this, snapLogic and Wainewright recommend not a replacement of the existing "corporate connection infrastructure" (which may be largely hand-coded scripts), but an enhancement and extension of it.

Is this data integration? Not for Wainewright; he prefers the term data connection -- in fact, if you buy his argument, you might say that connection is the new integration this season.

Domino's serves up Rackspace RackConnect hybrid hosting solution

bridgwatera

As the cloud computing model of IT delivery becomes ever more clearly defined in terms of its scope and capabilities, the more widespread use of its services is being seen in mainstream commercial operations from retailing to new forms of online media.

Computer Weekly last reported on Domino's Pizza's approach to information technology here, in an interview with Colin Rees, the company's IT director.

[Image: Domino.png]

This week, we learn of Domino's buying into RackConnect, an integrated cloud hosting and dedicated managed hosting service from Rackspace. The new service is designed to give Domino's a scalable and cost-effective platform that will support the execution of the company's ambitious growth strategy.

Key to Domino's success (says the company) is a multichannel retail strategy, which generated a 63% increase in online revenues alone in 2010. "To help drive revenue and future growth, Domino's sought a hosting service that would meet the evolving demands of its online business -- and allow its internal IT team to focus less on the maintenance of its online properties and business applications and more on innovation," said the company, in a press statement.

Rackspace senior VP David Kelly has said that the fact that Domino's Pizza has chosen the RackConnect service highlights a common need amongst businesses for a hybrid hosting platform that offers them more flexibility.

Domino's Rees said, "In 2010, over a third of our UK delivered orders were taken online, so having a reliable hosting solution is essential to our continued success. I knew that we needed to outsource our hosting infrastructure if we were to remain on our current growth trajectory and keep improving our customer experience. We wanted the best of both worlds, though: the high level of security and configurability that dedicated managed hosting affords a business, and the scalability of cloud hosting. Rackspace's RackConnect solution ticks all these boxes, and more."

So what does it do?

Rackspace says that RackConnect will enable Domino's to select which applications are placed where in the managed hosting infrastructure.

According to Rackspace, "For example, applications that require a high level of security, such as an internal email system, can be hosted on dedicated physical hardware. Domino's will also be able to take advantage of the on-demand scalability of The Rackspace Cloud for developing new smartphone or tablet applications, or handling the demands of a digital marketing campaign."
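A crude way to picture that placement decision in code: classify each workload by its security and scalability needs and route it to dedicated hardware or the cloud accordingly. This is purely illustrative -- the attributes and thresholds are invented, and RackConnect itself is configured through Rackspace's own tooling, not a script like this.

```python
# Illustrative only: decide where each workload lives in a hybrid setup.
workloads = [
    {"name": "internal-email", "sensitive": True,  "bursty": False},
    {"name": "online-ordering", "sensitive": True,  "bursty": True},
    {"name": "marketing-campaign-site", "sensitive": False, "bursty": True},
]

def place(workload):
    """Security-critical apps go to dedicated hardware; bursty ones to cloud."""
    if workload["sensitive"] and not workload["bursty"]:
        return "dedicated managed hosting"
    if workload["bursty"] and not workload["sensitive"]:
        return "public cloud (on-demand scaling)"
    return "hybrid: dedicated core with cloud burst capacity"

for w in workloads:
    print(f"{w['name']} -> {place(w)}")
```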

... and the only bad news here? Domino's TWO FOR TUESDAY offer was yesterday, so you'll have to wait until next week for your double deep pan pepperoni passion OK?

Size matters, in the big data cloud at least

bridgwatera

As it's the 4th of July today, I am anticipating a slightly slower news day for those of us in technology. Not that America runs the world you understand, but I think a day off for Silicon Valley will have some impact.

Given this reality, I want to revisit a subject I recently covered at feature level with some additional comments from companies that did not fit into my first draft.

The subject? -- big data in the cloud.

But what is big data -- and is it just marketing spin?

Well, it is spin to a degree, but we use the term to refer to datasets that have been collated and collected into large "lumps" -- perhaps terabytes (or even petabytes) of potentially dynamic, fast-moving data.

Crucially, it is the tools, processes and procedures around the data itself that define what big data is.

So what does the market think we should do to cope with data management at this level as it now, either logically or inevitably, takes up residence in the cloud?

[Image: big cloud.JPG]

"Businesses wanting to build big data stacks in the cloud need to make sure that they take the time to assess their options before choosing the technology that they are going to build on. There are a number of proprietary and open source tools out there for the taking, but picking the right one is not necessarily an easy task," said Jason Reid, CTO of hosted and managed data centre-based company Carrenza.

"Some IT vendors, welcome increasing data volumes. But despite what storage vendors may have you believe, you can't just keep throwing servers at your exponentially expanding data assets. Not all data is born equal and the importance of different types of data is far from constant; whereas today's data might need to be replicated and recoverable in seconds, the chances are that last week's data is less critical and can be stored on a cheaper medium," said Keith Tilley, managing director UK and executive vice president Europe for SunGard Availability Service

"Big data analytical queries will create a new set of workload management problems for IT. This workload will be small to begin with (users submitting queries to running single reports), but will expand soon into a massive amount of requests (applications generating queries automatically to generate trends or continuously looking for patterns)," said Ken Hertzler, VP, product management & product marketing, Platform Computing.

Hertzler continued, "Whether the data sits in a cloud, or internally in a data center, workload scheduling and management of the MapReduce requests is not a trivial matter to solve. Users will expect results in guaranteed response times, high availability, multiple simultaneous requests on the same data sets, flexibility, and a host of other requirements that ensure the results are accurate and on-time."
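For readers who haven't met MapReduce, the pattern Hertzler is referring to splits a job into a map phase that runs across many chunks of data in parallel and a reduce phase that combines the partial results; the scheduling headache comes from running thousands of such jobs with competing deadlines. Here is a minimal, single-machine Python sketch of the pattern itself (not of any particular scheduler or Hadoop-style cluster):

```python
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

# Map phase: each chunk of input is processed independently (and in parallel).
def map_word_counts(chunk):
    return Counter(chunk.split())

# Reduce phase: partial results are merged into the final answer.
def reduce_counts(partials):
    total = Counter()
    for partial in partials:
        total.update(partial)
    return total

if __name__ == "__main__":
    chunks = [
        "big data big workloads",
        "data in the cloud",
        "big queries on big data",
    ]
    with ProcessPoolExecutor() as pool:
        partials = list(pool.map(map_word_counts, chunks))
    print(reduce_counts(partials))
    # Counter({'big': 4, 'data': 3, ...})
```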

"At the scale of big data, organising and arranging masses of information so it's easy to analyse becomes a herculean task in itself: if you wait for data to be organised any insight you gain could well be out of date. The current generation of BI and analytics tools allow 'train of thought' analysis: querying unstructured stores of data in minutes or even seconds as and when needed, rather than in hours or days by appointment. As a result, organisations need to make sure that either their service providers can guarantee this level of access or that their internal cloud projects are using suitable technology. Otherwise, big data will only ever yield old news," said Roger Llewellyn, CEO of data warehousing, business intelligence and analytics company Kognitio.

I could drop a conclusion in here, but this market is still nascent, as I keep saying, so let's keep the communication channels open and revisit this topic soon.

Developers now working on Internet Explorer 10

bridgwatera

Microsoft has released the second version of what it calls the "platform preview" of Internet Explorer 10. This is the developer-centric, stripped-down GUI version of its next browser, presented in a "bonnet up" style so that programmers can tinker and learn.

With this update, Microsoft says that IE10 continues to deliver support for "site-ready HTML5 technologies", as well as improving performance through support for several new technologies like CSS3 Positioned Floats, HTML5 Drag-drop, File Reader API, Media Query Listeners, and initial support for HTML5 Forms.

According to Microsoft, "HTML5 application performance improves across the board, as well as the ability to deliver better performance with more efficient use of battery life with new technologies like Web Workers with Channel Messaging, Async script support, and others."

Web application security is said to have been improved, using the same markup with support for HTML5 Sandbox and iframe isolation. Microsoft is keen to highlight IE10's continuation of IE9's precedent for enabling web applications to do more in the browser without plug-ins.

"It also continues the pattern of offloading work to the parts of a PC that are best suited for them, like the GPU for graphics, and different processor cores for background compilation of JavaScript," says Microsoft's IE development lead Dean Hachamovitch, in his IE10 blog post.

IE10 supports CSS3 Positioned Floats to enable text to flow around figures on a page, building on IE10 PP1's support for CSS3 grid, multi-column and flexbox -- if that sentence is beyond your technical scope, then this image tells the story far more clearly:

[Image: floats.png]

The test drive at this link here illustrates how different browsers today give different results when running the same Web pages even though they all claim support for the same standards. The quality and correctness of different browsers' HTML5 engines vary widely.

[Image: How.png]

Web application security has also been addressed as illustrated in the image below -- the IE10 platform preview now supports HTML5 Sandbox and iframe isolation, an important component to web application security:

[Image: Secure dog.png]

Speaking directly to Microsoft's Ryan Gavin, senior director of Internet Explorer business and marketing, and Rob Mauceri, partner group program manager for Internet Explorer, Computer Weekly Developer Network learned that Microsoft has played a very active role on more than one W3C working group.

One can only hope that Microsoft's proximity to working groups and new specs as they are laid down will result in maximum interoperability as the product evolves.
