July 2012 Archives

Programmer's paradise: two months on a tropical island

bridgwatera | No Comments

Programmers like to work in isolation.

It's a well-worn stereotype: the lone programmer (C programmers, you know who you are) punching away at the keyboard in solitary confinement, surrounded only by a lifetime supply of Pepsi and pizza.

New-age collaboration tools, Agile scrum methods and social enterprise initiatives will do all they can to move developers out of these old ways, but a degree of self-contained solitary focus will most likely prevail whatever the development shop scenario.

It's interesting, then, to read a report on the BBC News technology pages that 12 programmers are being sought to go and live on a tropical island and code away to their hearts' content for two months.

The "Come Hack With Us" hackathon (come one, surely they could have called it "Robinson Cursor" or something?!) is being stages after organiser Walter Heck staged a similar retreat way up in Alaska that was apparently an "amazing experience" by all accounts.

Heck is currently looking for sponsors for his project. No kidding he is.

What is more, applicants will have to make their own way to the island and pay a small attendance fee. Heck says this is a move he has taken in order to keep out those "planning to party all the time", which, he says, "could be really detrimental to the atmosphere," after all.

Heck is based in Malaysia and is considering an island somewhere in the Philippines.

The BBC reports that 4000 developers have already registered an interest on the Come Hack With Us website.

Perhaps it's not as crazy as it sounds; Microsoft apparently used Australia as a developer testing ground to work through much of Windows 7 -- the country speaks English, works in relative isolation from the rest of the world's time zones and geographies, and has lots of sunshine too.

Maybe that's the secret recipe for success?

tropical.png

When (service) virtualisation imitates life

bridgwatera | No Comments

IT surveys come and go.

Of course we must first accept the universal truth that all technology surveys are flawed, biased and contrived from the outset, with "loaded" agendas designed to pander to the thinly disguised corporate message set pertaining to the vendor brand laying claim to the resulting "research" produced.

With that mental inoculation on board then, CA Technologies (the artist formerly known as Computer Associates) has this week gone public on a study entitled 'Business Benefits of Service Virtualization', which has found that conventional approaches to software development are hampering business.

NOTE: The study questioned three hundred in-house software developers in the UK, France and Germany on core operational issues such as the number of releases per year and the functionality expectations of users.

Paradoxically, CA is currently championing a software application development process called Service Virtualization.

This new approach to the development and testing of applications uses a virtual service environment designed to imitate a real production environment.

Spikes & Swings

Yes that's all very nice. But we know that (in practice) real world data flows in even the most foreseeable of environments are subject to unpredictable spikes, swings and fluctuations.

So model and virtualise as we might, we're never quite close enough to the perfect curve of algorithmic logic that truly describes how an application will fare once it has to live and breathe in post-deployment reality.

Standing his ground on this issue is Justin Vaughan-Brown, strategic relationships director EMEA - service virtualization, CA Technologies. Vaughan-Brown argues that Service Virtualization enables teams to develop and test an application using a virtual service environment that has been configured to "imitate a real production environment" -- and, crucially, "while providing the ability to change the behaviour and data of these virtual services easily in order to validate different scenarios."
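For a flavour of what such a virtual service looks like in practice, here is a minimal sketch using the open source WireMock library in Java -- a general illustration of the technique rather than CA's own product -- in which a stub stands in for a real inventory service and can be reconfigured to misbehave for different test scenarios. The port, URLs and payloads are illustrative.

```java
import com.github.tomakehurst.wiremock.WireMockServer;
import static com.github.tomakehurst.wiremock.client.WireMock.*;

public class VirtualInventoryService {
    public static void main(String[] args) {
        // Stand up a virtual service on a local port in place of the real back end
        WireMockServer server = new WireMockServer(8089);
        server.start();

        // "Happy path" scenario: the stub imitates the production response
        server.stubFor(get(urlEqualTo("/inventory/42"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withHeader("Content-Type", "application/json")
                        .withBody("{\"sku\":42,\"stock\":17}")));

        // Degraded scenario: same data, but slow -- useful for performance testing
        server.stubFor(get(urlEqualTo("/inventory/slow"))
                .willReturn(aResponse()
                        .withStatus(200)
                        .withFixedDelay(3000)
                        .withBody("{\"sku\":42,\"stock\":17}")));

        // Tests now point the application at http://localhost:8089 instead of production
    }
}
```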

According to CA's study, UK enterprises are worse off compared to France and Germany when it comes to application development and testing issues relating to cost, quality and time to market. The greatest concerns highlighted were:

· 59% of UK respondents cited quality and time-to-market on integration testing as a major challenge (compared to 48% overall)
· 41% had issues with performance testing (compared to 32% overall)
· 32% expressed concerns with regression testing

"This suggests earlier testing - when bugs are easier and cheaper to fix - is not as effective as it could be, and testing at later stages in the software lifecycle, such as integration and performance test phases, is more costly and causes significant application release delays," said CA's Vaughan-Brown.

CA's clarion call here is that we should wake up and realise that outdated application development and testing is having a major impact on UK enterprises, with 76% of respondents citing loss of reputation in the market as a major concern.

Just how fine-tuned has CA made this new system, then?

Despite my cheap sarcasm and snide jibes, the company clearly doesn't roll out major new systems just for its own amusement. As to the empirical, pragmatic, first-hand experiences of programmers when using service virtualisation (or Service Virtualization if you prefer)... now that is a survey result we would like to see.

Cloud development kudos lies in 'enhanced' multi-tenancy

bridgwatera | No Comments

People, users, customers, lawyers and (possibly even) software application developers appear to have been worried about cloud security for some time now.

But many initial misgivings were quite quickly dismissed as users started to grasp the proposition that the cloud vendors were coming to market with, i.e. a cloud is (essentially) just a managed server... and so the data and applications that you put in it are only as secure as the security controls that you place upon them.

So can we dig deeper?

Companies such as Progress Software are championing 'enhanced' multi-tenancy as the killer development factor for cloud security.

As such, the firm has labelled the version 11.1 iteration of its Progress OpenEdge cloud application development platform as capable of delivering stronger security coupled with simplified user authentication.

How does it do what it says on the tin?

Progress has one of its customers on record to explain.

Phil Jones, VP of support and development at Bluebird Auto Rental Systems, has clarified that the "enhanced multi-tenancy capabilities" in the latest release of the OpenEdge platform will "dramatically reduce" his administration overhead and improve application performance.

But, crucially, all this is done while keeping customer data physically separated and secure.

openedge-tour.jpeg

"The Progress multi-tenant database allows us to organise our data into regions to better align with the needs of our customers for improved reporting, streamlined maintenance, and simplified support activity," said Jones.

Colleen Smith, vice president, SaaS, Progress Software, commented: "Before companies launch business-critical applications into the cloud, they expect the same assurances of data security and compliance that they have long expected from applications run on-premises in their own data centres."

But what is enhanced multi-tenancy?

It's hard to perform a search on enhanced multi-tenancy and not come back with one vendor name: Cisco.

More specifically, that should be three names, as Cisco, VMware and NetApp have worked together to jointly design what they like to call a best-in-breed Enhanced Secure Multi-Tenancy (ESMT) architecture.

Deploying the Cisco Nexus 7000.png

You can read an entire PDF on this subject here, but in brief... Cisco's technology proposition is one of "defence in depth", i.e. secure separation within the data architecture, implemented at all layers and within devices.

According to Cisco, "The Enhanced Secure Multi-Tenancy architecture supports enterprise applications from server to the desktop or virtual desktop. The architecture scales up and down as needed. It also meets the performance, availability, automation, and security service-level requirements of individual applications required to deliver IT as a service (ITaaS)."

According to Rackspace, "Public clouds are fundamentally multi-tenant to justify the scale and economics of the cloud. As such, security is a common concern. Whereas the traditional security perimeter is a network firewall, the cloud security perimeter now becomes the hypervisor and/or underlying cloud application. Thus far, security in the cloud has been good, but this is very cloud dependent and requires a solid design and operational rigor that prioritises security."

... and the thought for the day?

Considering the supposed depth and penetration of cloud, the amount of industry discussion and analysis on this subject is comparatively scant.

How do we make video collaboration ubiquitous?

bridgwatera | No Comments

Unified communications company Polycom is aiming to make video-based collaboration technologies "even more" ubiquitous across major industry segments.

The open standards focused unified communications (UC) company wants to see more video collaboration for governments and organisations focused on healthcare, education and manufacturing -- as such, it is releasing a new suite of open APIs designed to facilitate the development of custom-built applications.

But what does it take to make video collaboration ubiquitous?

It's all about interoperability of course; that way we can all talk to each other.

Actually, to be honest -- it's all about interoperability, scalability, manageability, and secure delivery. But you get the point.

Polycom's gambit in this regard is its RealPresence Platform for universal video collaboration. The firm's new developer-focused release of APIs and software development kits (SDKs) is intended to allow programmers within enterprises and organisations to tailor applications that integrate with Polycom RealPresence video solutions for functions such as provisioning, scheduling, conference management, billing and resource reporting.

Anti-Virus company McAfee is said to be using this technology to engage multiple globally distributed engineers to support its most critical job cases.

"With the advent of the new software updates to the platform, especially the open APIs, we envision being able to far more easily and cleanly integrate the software we use to support our customers," said David Piekarski, senior manager, global telecommunications operations at McAfee.

Video also needs -- scheduling, management... and provisioning.

As well as a new online developer community for API support, Polycom is also now providing its RealPresence Resource Manager for scheduling, management, and provisioning. This technology works to manage up to 10,000 devices.

"The RealPresence Resource Manager makes it easier for IT administrators to manage their organisation's video collaboration network and support up to 10,000 devices including room and immersive systems, tablets, smartphones, laptops, and executive video desktops," said the company, in a press statement.

polycom.png

"As more organisations realise the benefits of collaborating face-to-face by video from anywhere - benefits such as increased productivity, improved engagement, and reduced operational and travel costs - they want to fully incorporate video into their business processes and workflows to make usage as easy as possible," said Sudhakar Ramakrishna, president of products and services, Polycom.

"Our new suite of APIs for the RealPresence Platform will provide this integration and give customers greater flexibility to innovate as they further build out their video collaboration networks to drive significant business value. At the same time, we're delivering more scalability and manageability to IT administrators who are looking to video-enable their organisations."


The future of mobile is cloud-based & server-side

bridgwatera | No Comments

The fabulous and free TED (Technology Entertainment Design) application for iPad is (I would like to argue) a great piece of software, which, once installed, allows the user to view a selection of superbly informative and thought provoking videos from the now world-famous "multidisciplinary" TED conferences.

Plus, a user can select the "My Talks" function and download (podcast-style) a number of talks to listen to once out of WiFi coverage.

Although this app does rely heavily on its back end to "feed" mobile devices with the data required to play each video, its "on device" functionality may be regarded as comparatively high...

... especially when we now move to a more cloud-based server-side mobile application world.

TED ipad.jpeg

So on this train of thought -- mobile cloud application company FeedHenry chose this month to release a Yankee Group whitepaper designed to analyse the mounting complexity of mobile apps.

The firm found that what cloud-based server-side really means today is a sea-change (or paradigm shift if you prefer) towards mobile Backend-as-a-Service (mBaaS) technologies.

A recent news story on this subject concluded that, "If indeed mBaaS has already become an industry standard acronym, then we should understand this term to refer to the work of software application developers who are looking beyond lightweight applications to building more complex ones requiring greater integration with backend systems."

This apparently new and much more concentrated focus on server-side infrastructures (particularly as software developers' tools are also moving INTO the cloud) means that we may now see more importance put on not only reusable code, but also preconfigured mobile app services and over-the-air updates.
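To make that a little more concrete, here is a minimal, generic sketch -- not FeedHenry's actual SDK -- of the sort of thing an mBaaS-backed mobile app does at startup: pull its configuration over the air from a backend service and fall back to a locally cached copy when the device is offline. The endpoint is hypothetical.

```java
import java.io.*;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.*;

public class OverTheAirConfig {
    private static final Path CACHE = Paths.get("config-cache.json");

    // Fetch the latest app configuration from the backend; if the network is
    // unavailable, fall back to whatever was cached on a previous run.
    public static String loadConfig(String endpoint) {
        try (InputStream in = new URL(endpoint).openStream()) {
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] chunk = new byte[4096];
            int read;
            while ((read = in.read(chunk)) != -1) {
                buf.write(chunk, 0, read);
            }
            Files.write(CACHE, buf.toByteArray()); // refresh the offline cache
            return new String(buf.toByteArray(), StandardCharsets.UTF_8);
        } catch (IOException offline) {
            try {
                return new String(Files.readAllBytes(CACHE), StandardCharsets.UTF_8);
            } catch (IOException noCache) {
                return "{}"; // no configuration available yet
            }
        }
    }

    public static void main(String[] args) {
        // Hypothetical backend endpoint
        System.out.println(loadConfig("https://mbaas.example.com/apps/demo/config"));
    }
}
```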

What does this all lead to?

Once again, it's mobile Backend-as-a-Service.

For further justification in this area we can look at research released this week from Rackspace in association with CityIQ, which has studied the investment in mobile computing capabilities by UK banks and financial institutions.

According to Rackspace, providing customers with mobile apps is now viewed as "being crucial to having a competitive advantage" by 63 per cent of financial services organisations.

The development of apps such as Barclays' Pingit is an example of how the use of technology can be exciting and engaging to the customer, says the firm; it also demonstrates the banking sector's wider progressive attitude and desire to dominate the mobile payments industry.

"The increasing importance of mobile in banking and finance can be attributed to the explosion in customer demand and expectations. The greater availability of cloud-based delivery mechanisms has made it feasible for mobile solutions to be developed and rolled out quickly and securely," said Fabio Torlini, Rackspace VP of cloud.

Torlini continues, "Cloud computing is considered by the overwhelming majority of respondents [in our study] as having a significant part to play in developing their mobile strategies, with 78 per cent agreeing the cloud is an enabler of increased mobility and 43 per cent stating their organisation is either using or planning to use it."

The future is bright, the future is mobile, the future is cloud.

Free iPad app for secure enterprise-level single sign-on, really?

bridgwatera | No Comments

Programmers working at the coalface of cloud services development just now will no doubt have the words "secure access to corporate data via mobile" ringing in their ears at some point.

For some time now the emphasis has been on methods to develop and refine secure remote access and (if you have the appetite for some more intense industry phraseology) so-called "robust remote synchronisation to the corporate data centre"... from smartphones to tablets to laptops to other embedded computing devices.

What we obviously need here is a free iPad app for secure enterprise-level single sign-on.

Could OneLogin's eponymously named OneLogin for iPad be the solution we are looking for? The company claims to have provided enterprise-class identity and access management for private and public cloud applications.

"Our industry-first iPad app has mobilised one-click access to any web application, securely and at no additional cost," said Thomas Pedersen, chief executive officer of OneLogin.

"Before today, mobile users were forced to deal with watered down versions of applications, and the cumbersome process of entering information on the iPad hampered both productivity and adoption. Today, we can take virtually any SaaS application and make it immediately useful on the iPad."

OneLogin uses multi-factor authentication for security, and the app also provides identity-driven iPad security, allowing IT executives to address bring your own device (BYOD) concerns by managing access down to the user and application level.
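OneLogin does not spell out its implementation, but as a general illustration of what a second factor involves, below is a minimal sketch of the time-based one-time password (TOTP, RFC 6238) calculation that many multi-factor schemes rely on. This is not OneLogin's code, and the shared secret is obviously just an example.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class Totp {
    // Compute a 6-digit time-based one-time password from a shared secret.
    public static String code(byte[] secret, long epochSeconds) throws Exception {
        long counter = epochSeconds / 30;              // 30-second time step
        byte[] msg = new byte[8];
        for (int i = 7; i >= 0; i--) {                 // counter as 8-byte big-endian
            msg[i] = (byte) (counter & 0xff);
            counter >>= 8;
        }
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(secret, "HmacSHA1"));
        byte[] hash = mac.doFinal(msg);

        int offset = hash[hash.length - 1] & 0x0f;     // dynamic truncation (RFC 4226)
        int binary = ((hash[offset] & 0x7f) << 24)
                   | ((hash[offset + 1] & 0xff) << 16)
                   | ((hash[offset + 2] & 0xff) << 8)
                   | (hash[offset + 3] & 0xff);
        return String.format("%06d", binary % 1000000);
    }

    public static void main(String[] args) throws Exception {
        byte[] sharedSecret = "example-shared-secret".getBytes("UTF-8");
        System.out.println(code(sharedSecret, System.currentTimeMillis() / 1000L));
    }
}
```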

onelog.jpeg

"Mobile device management (MDM) has done a good job of securing the first wave of mobile devices entering the workforce, but can often fall short in providing capabilities that secure the applications themselves," notes Eric Ahlm, research director with Gartner, Inc. "We are seeing a trend of companies struggling with the 'next app' phenomenon asking Gartner how to secure a broader set of applications on mobile platforms beyond email."

According to Gartner, Inc., Apple's iOS is predicted to remain the dominant media tablet operating system around the world in 2012, with 61.4% of sales to end users.

Balancing apps against cloud storage limitations

bridgwatera | No Comments

This is a guest blog post by Jeremy Thake, enterprise architect and Microsoft SharePoint MVP at AvePoint. In this guest piece for the Computer Weekly Developer Network, Thake discusses the importance of balancing application functionality against the storage parameters associated with SharePoint.

On 28th June 2011, Microsoft released a significant upgrade to its cloud services portfolio - Microsoft Office 365 - an online collaboration and productivity suite, which includes Microsoft SharePoint Online. While on-premise versions of SharePoint 2010 still offer a richer feature set than their online counterpart, many businesses are looking to take a hybrid approach to SharePoint that utilises the cloud as well.

With this in mind, there are certain considerations that businesses must make to optimise SharePoint, whether that be online, on-premise, or a combination of the two. One consideration is how to balance your applications against the limitations of the cloud storage model, whether it is through Office 365 or cloud-hosted storage.

While SharePoint Online offers core collaboration and document sharing tools, there are some limitations when relying on an internet-only version without the support of on-premise infrastructure. For businesses wanting to deploy more complex scenarios for application development or enterprise content management, SharePoint Online might not be the most economical or effective option due to storage restrictions and the associated costs with exceeding those parameters.

Without proper planning, businesses can easily find themselves storing unnecessary, unused data in the cloud. SharePoint Online is bound by specific limits in terms of storage. It comes with 10 GB of storage per tenant and 500 MB of shared storage per user. If more storage is needed, it can be purchased at an additional data charge per gigabyte (GB), per user, per month.
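As a quick back-of-the-envelope sketch of those numbers, the pooled allowance and any overspill for a given tenant can be worked out as below; the usage figures and the per-gigabyte rate are placeholders rather than Microsoft's actual pricing.

```java
public class SharePointOnlineStorage {
    public static void main(String[] args) {
        int users = 200;                         // example tenant size
        double baseGb = 10.0;                    // 10 GB of storage per tenant
        double perUserGb = 0.5;                  // 500 MB of shared storage per user
        double pooledGb = baseGb + users * perUserGb;

        double actualUsageGb = 150.0;            // example usage figure
        double overageGb = Math.max(0.0, actualUsageGb - pooledGb);

        double pricePerGbPerMonth = 2.0;         // placeholder rate, not Microsoft's price
        double monthlyOverageCost = overageGb * pricePerGbPerMonth;

        System.out.printf("Pooled allowance: %.1f GB%n", pooledGb);
        System.out.printf("Overage: %.1f GB, costing %.2f per month%n",
                overageGb, monthlyOverageCost);
    }
}
```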

Sandbox serendipity

While storage and bandwidth are key considerations for businesses using SharePoint, the kind of applications that businesses choose to host on collaboration platforms can also determine whether they are best suited to an online or on-premise environment. Because SharePoint Online is a multi-tenant model, businesses will find that the only form of custom coding it supports is sandboxed solutions.

NOTE: Sandboxed solutions enable permitted users to upload custom solutions through the web interface based on technologies such as Silverlight, jQuery and Client Object Model. However, these solutions are limited by security restrictions and governed by the farm infrastructure, whereas the Full Trust solutions - which are only available on premise - have fewer limitations in terms of functionality, scope or code access security.

Nevertheless, the scalability of the cloud platform coupled with the quick set-up and low start-up cost means that SharePoint Online is perfectly suited to SharePoint development, testing and proof of concept work. By using SharePoint Online as the development and testing platform, businesses can evaluate and experiment with applications before opening them up to the masses and incorporating new tools into their on-premise environments.

Whether businesses decide to use SharePoint Online, on-premise, or a mix of the two, they must ensure that they have carefully considered the limitations and restrictions surrounding storage and application development.

Businesses can easily find storage costs spiralling out of control if they do not acknowledge the storage restrictions and the costs associated with exceeding those parameters. As well as the storage limitations associated with SharePoint Online, businesses must also consider what applications they require. If it is a necessity to run complex Full Trust applications, then SharePoint Online will not support them.

Business reality

Many businesses are looking for a balance between the fast set up, scalability and flexibility of the cloud coupled with the full features and complex application solutions which are only available in on-premise environments...

... with this in mind we will see more businesses taking a hybrid approach as they look to balance application functionality against the storage parameters associated with SharePoint.

To kill a (non) mocking Agile software application developer

bridgwatera | No Comments

As Agile software application development has become more popular in recent times we have (by no coincidence) logically seen the rise of extended code analysis tools in the Agile space.

The concept is simple. With a deeper level of granularity into extended code analysis, the pursuit of better debugging is made easier -- even in rapidly shifting Agile development scenarios.

NOTE: TechTarget's own definition of Agile software (application) development (ASD) states that: This is a methodology for the creative process that anticipates the need for flexibility and applies a level of pragmatism into the delivery of the finished product. Agile software development focuses on keeping code simple, testing often, and delivering functional bits of the application as soon as they're ready. The goal of ASD is to build upon small client-approved parts as the project progresses, as opposed to delivering one large application at the end of the project.

Software testing specialist Typemock says that the key to this "deeper level of granularity" is isolated unit tests. The company highlights findings which estimate that although 70% of Agile developers say that they unit test, far fewer of them actually write automated, isolated unit tests.

Since isolation requires "mocking", developers are not testing their software as thoroughly as they might, says Typemock product manager Gil Zilberfeld.

NOTE: Mock objects mimic the behaviour of real objects. In a unit test, mock objects can simulate the behaviour of complex, real (non-mock) objects. Mock objects are useful when a real dependency, such as a database or network service, is impractical or impossible to incorporate into a unit test.
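Typemock's own tools target .NET, but the principle is language-agnostic. Here is a minimal sketch in Java using JUnit and the open source Mockito library, in which a payment gateway dependency is mocked so that the unit under test runs in isolation, in milliseconds, with no real network call involved. The class and method names are invented for illustration.

```java
import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.*;

import org.junit.Test;

public class CheckoutServiceTest {

    // The real dependency would make an expensive call over the network.
    public interface PaymentGateway {
        boolean charge(String customerId, double amount);
    }

    // The small unit under test.
    public static class CheckoutService {
        private final PaymentGateway gateway;
        public CheckoutService(PaymentGateway gateway) { this.gateway = gateway; }
        public boolean checkout(String customerId, double amount) {
            return amount > 0 && gateway.charge(customerId, amount);
        }
    }

    @Test
    public void successfulChargeCompletesCheckout() {
        // The mock imitates the real gateway -- no database, network or cloud involved
        PaymentGateway gateway = mock(PaymentGateway.class);
        when(gateway.charge("alice", 9.99)).thenReturn(true);

        CheckoutService service = new CheckoutService(gateway);
        assertTrue(service.checkout("alice", 9.99));

        // Prove the collaboration happened exactly as expected
        verify(gateway).charge("alice", 9.99);
    }
}
```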

versions.png
Image Versionone

"When your unit tests use mocks, they run faster. Instead of expensive calls to the database or the cloud, you can shorten tests to milliseconds. A full test suite can run in seconds instead of hours, giving you immediate feedback on your code," said Zilberfeld.

According to Typemock, developers who use its methodologies and technologies will find that when a unit test fails, the resulting scenario is easier to handle.

"They don't need to start up the entire application, set a breakpoint, miss it, return to it, shut it down, fix it, and repeat until it's really fixed -- as unit tests (typically) test very small pieces of code, where the rest is mocked," said the company, in a press statement.

Help! my boarding pass has no software integration

bridgwatera | 1 Comment

Having just flown transatlantic with a well-known American carrier that shall remain nameless...

... I sometimes wonder how they justify operating what should be pretty seamless systems to look after our bookings, when things can go so wrong.

The system managed to book my wife and me into completely separate seats, which we had to go back in and amend on more than one occasion.

As passenger bookings go up and down, plane sizes are changed (that's fine - we know this happens), but for my chosen airline this meant that my wife and I were placed on opposite sides of the plane, in different seats, again.

We have the same surname and we shared a single booking reference; it can't be that difficult.

So if Sabre, Amadeus and/or Galileo systems are being used here (and I'm pretty sure that they are), I would suggest that it is not the software's fault, but the fault of those Systems Integrators (SIs) and internal developers and operations staff whose job it is to ensure that these systems work in an INTEGRATED manner -- which they clearly do not, not at every level at least.

Oh and the name of that airline?

I couldn't possibly divulge...

... but it's the second part of a world famous soccer team's name from Manchester, it comes before Kingdom, Arab Emirates and States and it rhymes with "delighted" - even though I'm not.

United.b777.arp.750pix.jpg
Free Image Use: Wikimedia Commons

Subversion to Git: a change in direction?

bridgwatera | No Comments

This is a guest post to the Computer Weekly Developer Network by Chris Lucca, technical evangelist for software configuration management (SCM) and Agile ALM company AccuRev.

Git is undoubtedly attracting more and more fans because it is fast, flexible, simple to learn and rich in developer-friendly features. As an increasing number of developers are applying pressure to make the move to Git, this article provides high level guidance to the management team on the implications of such a move on an enterprise and why the leadership team should care.
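For a taste of the developer-friendly speed Git is loved for, here is a minimal sketch using the open source JGit library (assumed to be on the classpath) showing how cheaply Git creates a repository, commits and branches entirely locally. Paths, names and commit messages are illustrative.

```java
import java.io.File;
import java.nio.file.Files;

import org.eclipse.jgit.api.Git;

public class LocalGitDemo {
    public static void main(String[] args) throws Exception {
        File workDir = Files.createTempDirectory("git-demo").toFile();

        // Initialise a repository -- no server round-trip required
        Git git = Git.init().setDirectory(workDir).call();

        Files.write(new File(workDir, "README.md").toPath(),
                "Hello, Git".getBytes("UTF-8"));

        // Stage and commit locally
        git.add().addFilepattern("README.md").call();
        git.commit().setAuthor("Dev One", "dev@example.com")
                    .setMessage("Initial commit").call();

        // Branching and switching are local, near-instant operations
        git.branchCreate().setName("feature-x").call();
        git.checkout().setName("feature-x").call();

        System.out.println("Current branch: " + git.getRepository().getBranch());
    }
}
```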

If software configuration and change management (SCCM) tools had never existed, then we might arguably have gone on interminably without the speed and efficiency benefits that version control systems bring to bear upon modern programming environments.

But software application development shops of all sizes have always known that they need to create, maintain and uphold performance and control over the code base at all times if they are to meet deployment targets and user requirements -- so we might argue that a certain inevitability has always been at work in this space.

So SCCM (or revision control if you prefer) was always destined to evolve organically out of the collective need for technical source code management functionality. As the use of collaborative tools has developed -- from Post-It Note reminders to emails and onward to wikis and instant messages -- the globally distributed nature of software teams has also expanded, and this has made these tools more important every day.

So version control grew and learnt to walk upright, and eventually many developers grew tired of proprietary tools in the space, impressed by their enterprise-class power but held back by either their lack of speed or their inflexibility - or indeed both. Now that Apache Subversion and Linus Torvalds' Git exist, we know that we have a choice when it comes to open code commit and control software -- so which way should we turn when we need to make this trade-off?

The trade-off

Actually it's not so much a trade-off between Git and Subversion; each has its relative virtues and distinguishing characteristics, but it is the overall functionality which will now dictate both projects' longevity as programmers choose one system over another.

As with almost all strains of open technology, at a deeper enterprise-grade deployment level we find the extended ground of commercially supported iterations. The ability to offer the power and flexibility of a tool like Git, but augmented, enhanced and enriched with the necessary levels of security, auditability and development process visualisation, is what drives the current bleeding edge of SCCM technology today.

The facts are that there are (yes, commercially supported and therefore coming at a cost) routes to improved functionality with Git in this space. But this is a cost that many customers will recognise they need to invest in (alright, I mean "pay for") because they can also see a route to leveraging (alright, I mean "using") more powerful process management.

Organic "extensions" to open source

These are the kind of extensions which will always come about naturally and organically once an initially open source project moves into the commercially supported realm, purely because of factors which I can only put down to the laws of economics and natural selection, i.e. they simply work better, so financially accountable firms see these options as a route to increased return on investment and, ultimately, profit.

I could go on here. The benefits are multifarious, manifold and (in my not so humble opinion) quite marvellous. Substantial, tangible, real-world benefits exist in extended iterations of the Git universe, such as the option for developers to interact using collaborative tools before code commits are made - it's always good to talk, right? There are also options for workflow management, deeply integrated issue tracking and software development process visualisation - the list goes on.

In the dying days of the Sun Microsystems empire, before Oracle came to town, company CEO Jonathan Schwartz was often known to say that Solaris and the tools were all there and were all free - but when people needed service and maintenance, Sun would be there to support them. Extended, enhanced, supported versions of Git follow something of the same ethos, i.e. a higher path is attainable for businesses that need it.

The PC is dead; long live the 'cloud monitor'

bridgwatera | 4 Comments

Will PCs (as we once knew them) die out now that the cloud computing model of service-based software application delivery and virtualised data storage and management has taken hold?

Are we one step away from referring to computers not as computers, but as "cloud monitors", where the only 'Windows' on show is a window to the software available from our chosen hosting provider?

We have (largely) already reached a point when we no longer install software off-the-shelf and out-of-the-box onto our machines, as online downloads have become the norm.

This "online download trend" although perfectly acceptable at the PC (or Mac etc.) level is further fuelled by the fact that users are estimated to have somewhere around four to five times the number of "self-installed" applications on their mobile devices (smartphones and tablets in this case) than they do on their desktop machines.

The End of Software -- as we know it

Does this all play out well with Salesforce.com's CEO Marc Benioff who defined the mission for his company as a quest to bring about -- The End of Software, as we know it?

The Force.com platform itself rests and revolves around a Customer Relationship Management (CRM) application suite, although the company has used the last decade and a half to expand its reach and scope into what appears to be labelled as "social enterprise" applications.

So how will Salesforce.com bring about the end of software?

The company's strategy involves opening up its infrastructure so that "everyone" can use it for custom application development. "With salesforce.com's Force.com cloud platform, you can build any business application and run it on our servers," says the company.

force.jpg

But is the software application world really "beating a path" to the Force.com platform as Benioff claims?

Anyone who has attended a JavaOne/Oracle Develop event with CEO Larry "did I tell you I won the America's Cup" Ellison will tell you that Benioff gets a few pot shots taken at him at every keynote i.e. not "everyone" is completely sold on the whole "develop for the cloud only via Force.com" message yet... but some of the momentum may indeed be gathering.

Benioff contends that "traditional" software application development has "too many moving parts" to buy, install, configure and maintain -- and that, moreover, the "entire infrastructure" requires constant maintenance to keep it working smoothly.

These anti-Agility messages will clearly not win fans within the open source community, which may indeed take umbrage at Salesforce.com's dismissal of other systems as a "welter of unintegrated, homegrown systems on spreadsheets, personal databases, or other unsupported platforms".

But will developers start to view cloud platforms such as Force.com as a new route to custom (they're American - they mean "bespoke") application development on the cloud?

There is certainly appeal in being able to get all your business logic, workflow rules and approval processes from one central hub. One might even argue that robustness, in terms of both availability and security, could be improved via this route.

Access to Force.com is through a web browser so development and deployment both take place in the cloud. "The platform itself provides everything you need for robust enterprise application development through a combination of clicks, components, and code," says the company.
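Purely as an illustrative sketch -- the instance URL, API version and token below are placeholders, not working values -- querying a Force.com organisation from outside the browser follows roughly this shape, using the platform's REST interface and an OAuth access token:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class ForceDotComQuery {
    public static void main(String[] args) throws Exception {
        // Instance URL, API version and token are placeholders for illustration
        String instance = "https://na1.salesforce.com";
        String soql = URLEncoder.encode("SELECT Name FROM Account LIMIT 5", "UTF-8");
        URL url = new URL(instance + "/services/data/v25.0/query?q=" + soql);

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", "Bearer <access_token>");

        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line);   // JSON result set from the platform
            }
        }
    }
}
```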

So where do we go from here?

Changes are afoot, but don't throw away your PC or Apple Mac just yet please.


About this Archive

This page is an archive of entries from July 2012 listed from newest to oldest.

June 2012 is the previous archive.

August 2012 is the next archive.
