September 2010 Archives

AVG launches Internet Security Suite 2011, but what about the Bayesian probability factor?

bridgwatera | 2 Comments

So last night I was able to meet and share canapés with AVG CEO J.R. Smith and the company's CTO Karel Obluk to celebrate the launch of the new AVG 2011 Internet security suite. What started out looking like a back-slapping meet-and-greet did in fact turn into a deep-dive technology update and a mathematics lesson in Bayesian probability.

Amidst the crab balls on spoons, miniature vegetarian tacos served by effeminate waiters and the glitzy gloss of a product launch you can sometimes, if you are lucky, get down to the nitty gritty of why a company has gone to market with a new product version. This was my mission...

AVG says its 2011 iteration features enhancements based on feedback from the company's global community of more than 110 million users and now includes enhanced web and social network protection.

But what kind of feedback is this? What kind of enhancements have been made -- and is AVG's so-called People-Powered Protection technology anything more than marketing puff?

I had a similar problem when I attended the launch of Adobe Photoshop Elements last month. I asked the speaker how the company had re-engineered and re-architected the product to extract it from, and simplify it relative to, the total Creative Suite 5 offering. I got a "we'll get back to you on that" - and I'm still waiting.

This was not so much the case last night. AVG does seem to take the back end seriously enough to bring its CTO along to product launches - and this guy is a programmer of the old school. Karel Obluk took me through software kernels, re-architecting modules to handle zero-day attacks, how automatic updates are engineered at the back end, how the analysis labs work inside an anti-virus company and where I should look for the best beer in his home city of Prague.

The concept is simple, or at least it ought to be. If you want developers and programmers to adopt your product and actually use it - then they are going to need to know more than whether or not it comes in a shiny new box.


AVG's People-Powered Protection appears to be a system where users opt in or out of sharing data about the websites (both safe and potentially malicious) that they visit. By aggregating this data and feeding it into analysis engines fuelled by (among other techniques) Bayesian probability logic - where reasoning can be theorised on the basis of 'uncertain' statements - the company says it builds new Internet security power.
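For a flavour of the mathematics involved - a toy illustration only, not AVG's actual engine, and with invented numbers - Bayes' theorem updates a prior belief that a site is malicious when a new community-reported signal arrives:

```javascript
// Toy Bayesian update: P(malicious | signal) from a prior and two likelihoods.
// All figures here are invented for illustration.
function posteriorMalicious(prior, pSignalGivenBad, pSignalGivenGood) {
    // Total probability of seeing the signal at all
    var pSignal = pSignalGivenBad * prior + pSignalGivenGood * (1 - prior);
    // Bayes' theorem: P(bad | signal) = P(signal | bad) * P(bad) / P(signal)
    return (pSignalGivenBad * prior) / pSignal;
}

// Suppose 1% of sites are malicious, and a community-reported signal fires
// on 90% of bad sites but also on 5% of good ones...
var p = posteriorMalicious(0.01, 0.9, 0.05);
// ...belief that the site is malicious jumps from 1% to roughly 15%
```

The point of aggregating millions of users' reports is precisely to sharpen those likelihoods.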

Without going into any further analysis of Bayesian probability and the state-of-knowledge concepts of objectivist versus subjectivist views, there is still a lesson here, I believe.

Of course, AVG does have its free-to-download LinkScanner technology, which analyses website content before a user's browser is directed onward - and it also has its AVG Free model which, arguably, brings in a good quotient of faithful converts to the AVG way.

But if you want to sell your product to families and non-techies, then fine - keep your messages pretty much as they are. If you want to seed power users from the top down who will understand why your product is better (if indeed it is) - then drop in some Bayesian probability theory and a CTO briefing. Otherwise, those canapés had better be really good.

The "Offline" Web Application

bridgwatera | No Comments

This is the third blog post by Mat Diss, founder of bemoko, a UK mobile Internet software company that is focused on developing new ways for web designers to construct better websites that can be delivered across all platforms.

In this blog, Mat looks at the offline support HTML5 delivers for web applications...

A key benefit of native applications (whether desktop or mobile) is the ability to interact with the application whilst you are offline. HTML5's offline support allows web applications to achieve this. Google has demonstrated its backing for HTML5 offline support by announcing that it is no longer supporting Google Gears - an early solution for offline web access - and is backing the HTML5 offline APIs.

So along with strong support from the major browsers, this is an indication that this API will mature and become an essential foundation for the web. Offline storage will be an essential ingredient of any web application that requests information from a user and delivers essential information that a user would want to access anywhere, anytime.

For example, when a user fills in a form or edits some data, it is an important aspect of the user experience that the information entered is not lost, causing frustration. With HTML5, changes can be stored locally in the browser and synced with the main server when a connection is re-established.

Information applications - such as travel guides - also provide much more value if you can access information quickly and reliably, even if you are in the tube, on a plane, or in a foreign country on an expensive data plan.
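Alongside local data storage, HTML5's application cache lets the browser keep the application's own resources (pages, styles, scripts) available offline. These are declared in a manifest file - the file names below are illustrative:

```
CACHE MANIFEST
# v1 - resources the browser should cache for offline use
index.html
style.css
app.js

NETWORK:
# resources that always require a live connection
/sync
```

The page opts in by pointing its html element's manifest attribute at this file; local data storage then handles the user's own changes, as in the example that follows.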

The example below implements a simple local 'to do' list manager:

function storeData() {
    try {
        var newDate = new Date();
        var itemId = newDate.getTime(); //creates a unique id with the milliseconds since January 1, 1970
        var todo = document.getElementById('todo').value;
        localStorage.setItem(itemId, todo);
        getData();
    } catch (e) {
        alert('Quota exceeded!');
    }
    document.getElementById('todo').value = "";
}

function getData() {
    var todoLog = "";
    //now we are going to loop through each item in the database
    for (var i = 0; i < localStorage.length; i++) {
        //lets setup some variables for the key and values
        var itemKey = localStorage.key(i);
        var todo = localStorage.getItem(itemKey);

        //now that we have the item, lets add it as a list item
        todoLog += '<li>' + todo +
            ' <a href="#" onclick="removeItem(\'' + itemKey + '\');return false;">[delete]</a></li>';
    }

    //if there were no items in the database
    if (todoLog == "") {
        todoLog = '<li>Log Currently Empty</li>';
    }

    document.getElementById("todoList").innerHTML = todoLog;
}

function removeItem(itemKey) {
    localStorage.removeItem(itemKey);
    getData();
}

function clearData() {
    localStorage.clear();
    getData();
}

window.onload = function() {
    getData();
};
Which can be used from the following page:

<h1>I've got to ...</h1>
<input type="text" id="todo" placeholder="Enter something you have to get done">
<button onclick="storeData()">Add item</button>

<h2>My to do list...</h2>
<ul id="todoList"><li>Nothing to do</li></ul>
<button onclick="clearData()">Clear list</button>

Note that once this application has been downloaded by the user it can be used offline (i.e. without a network connection). In a full application you would probably want to persist these changes to a back-end server when the user comes back online, which can be done by hooking into the onstorage event handler.

Michael Dell talks 'Zettabytes' at Oracle OpenWorld

bridgwatera | No Comments

This enormous IT show started off today with a presentation from Michael Dell. The man himself kicked off with an overview of the partnership that has existed between his company and Oracle since 1995. This week has seen the launch of a new Oracle consulting and implementation practice at Dell, so it's fairly safe to say that the two companies are snuggling up closer than ever.

Dell went on to detail a comparatively new measure of data in terms of our general awareness: the zettabyte. A zettabyte is 1.1 trillion gigabytes, and studies suggest that by 2020 we'll all be creating about 20 zettabytes per year, with about one third of that data passing through a cloud.

Dell went on to talk about servers, storage, networks and services. He leaned towards how Dell works at the application level, detailing several large implementations of Dell hardware and services and bringing out customers including Zynga Games (the people behind FarmVille) and FedEx.

Dell is pointing to 'next-generation' data centres that will power the new ways we handle data over the next couple of decades - clearly he wants to make sure that his company's servers are well positioned to drive these data banks - and it would be hard to argue that his relationship with Oracle will harm that objective.

The newly enhanced Dell Services Oracle Practice is designed to drive cost and complexity out of IT infrastructures. The company says that, "Dell Services also helps companies simplify data center operations with assessment, design and implementation services that focus on Oracle Database, including Real Application Clusters, in a standards-based x86 environment. These services are designed to enhance data availability while lowering the total cost of ownership."


Given Oracle's own prowess in database hardware and the newly launched Oracle Exalogic Elastic Cloud, it's kind of hard to know where Oracle stands on hardware from a corporate policy perspective. On the one hand, Oracle says that it buys a lot of Dell kit. On the other hand, the Exalogic box is described as the world's first and only integrated middleware machine - or to use Oracle's own words, "A combined hardware and software offering designed to revolutionise data center consolidation."

Either way, if Dell's zettabyte predictions are true then we're going to see a lot of software application developers and database administrators being kept very busy in the months and years ahead.

Larry Ellison: how should we define the cloud?

bridgwatera | No Comments

If you've been following the news for Oracle OpenWorld and JavaOne 2010 so far you'll have seen a whole heap of products unveiled. The company has carpet-bombed the newswires with some pretty meaty announcements. There is therefore, I hope, room for some slightly more tangential comments as to the look and feel of the show.

Larry Ellison himself used the first keynote of this event (unusually hosted on a Sunday night) to detail his all-singing, all-dancing integrated hardware and software box -- the Exalogic Elastic Cloud. The concept behind this product is that it contains all the hardware, middleware, processing power and software needed to create a cloud infrastructure.

Under the new tagline 'Hardware and Software Engineered to work Together', Oracle is rolling out what it claims to be a total cloud solution - but is it simply, as some commentators have suggested, nothing more than an 'intelligent server'?

Is this announcement no more than "cloudwashing" - i.e. putting the term "cloud" in front of some otherwise standard piece of kit - or is it true innovation?

Could this really be the cloud in a box - delivered?


Looking at the hardware on offer here, we can see how Larry defines the cloud.

"The whole idea of cloud computing is to have a pool of resouces that is shared by lots of different applications within your company - at end of month accounts, the resouces are directed towards the finance department, when it's time for a sales drive, a different department gets what they need and releases the power that they don't need," said Ellison.

If the Exalogic Elastic Compute Cloud is indeed a cloud in a box, then this is what it takes to build one:

  • 30 servers in a box - 360 cores
  • An InfiniBand network - so the servers can all talk to each other (but with a lot of bespoke additional work from Oracle)
  • A storage device
  • A VM with two guest OS options - Solaris for the high end, and Linux
  • A strong middleware component

... and the secret sauce? Well, that's the "coherence software" that conjoins the memories in all of the servers so that they appear as one complete unit.

OK, so every tech vendor worth its salt has its own spin on what cloud computing really is - and Larry clearly had a riot of a good time running down the efforts of Salesforce.com and Red Hat. He specified that if he ABSOLUTELY HAD to pick another company whose offering fits Oracle's definition of the cloud, it would probably be Amazon's EC2 - the Elastic Compute Cloud.

For Larry, the cloud must be a "platform" upon which you build other applications, which includes both hardware and software, and it's all standards-based. The on-demand element is especially important here - so much so that Amazon put the term ELASTIC into the name itself (which Larry actually quite likes); after all, it was Amazon that really popularised the term.

Is your head still in the clouds? Spare a thought for the attendees who had to sit through a 10-minute gung-ho boys-at-sea video of Larry winning the America's Cup in his massive racing trimaran.

Tonight, all the UK press are going out for a few beers with him, while he asks us what we like most about open source computing and the community contribution model of software application development, before we head on out for burgers.

OK - so that last bit didn't happen, but all the above did!

Oracle OpenWorld: paving the way for Solaris 11

bridgwatera | No Comments

Oracle has used its Oracle OpenWorld and JavaOne exhibition in San Francisco this week to outline the next major release of the Oracle Solaris operating system. The company says that it is paving the way for the next version of this enterprise level OS by releasing Solaris 11 Express, to provide customers with access to the latest Solaris 11 technology.

The company says that the first Oracle Solaris 11 Express release, expected by the end of calendar year 2010, will give customers timely access to the latest Solaris 11 features with an optional Oracle support agreement.

This release is intended to be the path forward for developers, end-users and partners using previous generations of Solaris.


The full version of Oracle Solaris 11 is expected to increase application throughput, exhibit improved platform performance and maximize overall platform reliability and security.

Here at the show itself, rumours suggest that this is some of the first evidence of the wider Oracle technology stack being brought to bear upon Solaris - and that this is positive joint engineering and integration.

When it does arrive in its full blown version, Oracle Solaris 11 is scheduled to contain more than 2,700 projects with more than 400 new so-called "inventions".

"Oracle Solaris 11 is now increasing system availability, delivering the scale and optimisations to run Oracle and enterprise applications faster with greater security and delivering unmatched efficiency through the completely virtualised operating system." said John Fowler, executive vice president, Systems, Oracle.

Oracle says that Solaris 11 will virtually eliminate patching and update errors with new dependency-aware packaging tools that are aligned across the entire Oracle hardware and software stack to reduce maintenance overhead.

The advantages of the Geolocation API

bridgwatera | No Comments

This is the second post by Mat Diss, founder of bemoko, a UK mobile Internet software company that is focused on new ways for web designers to construct better websites that can be delivered across all platforms.

In this blog, Mat looks at the advantages of the Geolocation API...

The Geolocation API brings with it the possibility for users to share their location. By design, this is a user choice on a site-by-site basis - users choose to share their location when they feel they'll benefit and when they trust the service provider. This opens web sites up to location-based services, such as finding people or places nearby; combined with mapping services, this can all help to create a more pleasurable experience with a service provider - saving time and shoe wear.

You can locate a user by including the JavaScript below in your page

and then including a 'find me' button like this:
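A sketch of what that script and button might look like - the /nearby results URL, element ids and function names here are assumptions for illustration:

```javascript
// Build the redirect URL from a position (kept as a separate, testable function)
function buildNearbyUrl(lat, lon) {
    return '/nearby?lat=' + encodeURIComponent(lat) + '&lon=' + encodeURIComponent(lon);
}

// Wire this to a button, e.g. <button onclick="findMe()">Find me</button>
function findMe() {
    if (!navigator.geolocation) {
        alert('Geolocation is not supported by this browser');
        return;
    }
    navigator.geolocation.getCurrentPosition(
        function (position) {
            // Success: redirect with the latitude and longitude populated
            window.location.href = buildNearbyUrl(position.coords.latitude,
                                                  position.coords.longitude);
        },
        function (error) {
            alert('Unable to retrieve your location: ' + error.message);
        }
    );
}
```

The browser prompts the user for permission before the success callback ever fires, which is what makes this a site-by-site choice.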


This example redirects the user to a new URL with the latitude and longitude populated, but you could equally do this with an AJAX callback.

Visit the demo site at:
View Video file: bemoko html5.gelocation.wmv

HTML 5.0 tutorial: the advantages of native media support for delivering video

Cliff Saran

This is the first post in a tutorial by web application specialist Mat Diss, covering the new features in HTML 5.0.

HTML5 brings native media support to the browser. There has been much fragmentation in the formats of video and audio required for web delivery. Historically this also came with a lack of control over how the media is displayed (e.g. embedding in a page) and the requirement for extra plug-ins. This makes it more costly for service providers to deliver media and makes media experiences less than seamless for the end user. By standardising media support within the web environment, this fragmentation can be brought under control, making video delivery easily accessible to all and making the experience more pleasurable for the user.
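A minimal example of the native approach (the file names are placeholders): the browser plays the first source format it supports, with no plug-in required.

```html
<video width="640" height="360" controls>
  <!-- the browser picks the first source it can decode -->
  <source src="clip.mp4" type="video/mp4">
  <source src="clip.webm" type="video/webm">
  <source src="clip.ogv" type="video/ogg">
  Your browser does not support the HTML5 video element.
</video>
```

Listing multiple sources is the practical workaround for the codec fragmentation discussed below.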


There is still the ongoing battle between the Ogg, H.264 and WebM video codecs. H.264 brings widely accepted improved video delivery, but in a proprietary format. Google created WebM to provide an open standard to compete with H.264. MPEG LA responded by announcing that it will indefinitely extend the royalty-free use of H.264 for free web content. There are strong and passionate arguments on either side. Standardisation brings with it significant performance benefits, with device manufacturers bringing standard video codec processing into hardware instead of software, which can lead to over twice as much battery life.

Mat Diss is founder of bemoko, a UK mobile Internet software company pioneering new ways for web designers to construct websites that can be delivered across multiple platforms.

IBM predictive analytics software helps save Grévy's zebra

bridgwatera | No Comments

IBM isn't just a big old behemoth of an IT company concerned with databases, Rational software application development processes, massively powerful Z-series servers, proprietary technology leadership, autonomic computing mechanisms and open source innovation y'know?

It cares about the Zebra too I'll have you know.


Big Blue execs were in London this week to talk about how the company's predictive analytics software is playing a key role in helping conserve wildlife in northern Kenya. Working with conservation charity Marwell Wildlife, IBM hopes to help secure a future for Grévy's zebra, an endangered species with fewer than 2,500 individuals in the wild.

Winchester-based Marwell Wildlife is conducting a survey on Grévy's zebra in which Kenyan nomadic herdsmen are interviewed about the animal - what they think the key threats facing it are, and where they think it is located. The herdsmen have very good wildlife knowledge and interviewing them is a very efficient means of collecting information over this vast and inaccessible area. The charity then uses IBM predictive analytics software to help identify patterns and analyse data, which will help inform decisions on conservation measures to be taken.

"The IBM predictive analytics software is critical in analysing the information we collect from the field. The data from the surveys is vast and complex and requires powerful software to analyse it. The software is ideal for identifying trends and patterns from this data," said Dr. Guy Parker, head of biodiversity management at Marwell Wildlife.

"In the case of the recent interview survey, the software enabled us to determine peoples' attitudes towards the Grevy's zebra. Furthermore, we were able to determine what influence factors such as education level, age, location, and wildlife benefits had upon peoples' attitudes. This is the kind of complex multi-variate analysis that the IBM predictive analytics software is designed to tackle," added Parker.

This analytics solution is powered by IBM SPSS predictive analytics software.

IBM Predictive Analytics at Marwell Wildlife

From sand to silicon: Intel Developer Forum 2010

bridgwatera | No Comments

Intel likes to talk about the future - and the company has been pretty adept at playing the 'industry visionary' ever since Andy Grove wrote his seminal book Only The Paranoid Survive back in the nineties.

It should come as no major surprise, then, that the company has been talking about laying a foundation upon which to build common hardware, software and ecosystem solutions since its Beijing Developer Forum, held in April of this year.

The Computing Continuum?

Trying hard to coin a new industry phrase or catch line, Intel would like to see its processors and supporting technologies proliferate across devices so that a common and connected computing experience is made possible - "across the computing continuum" as the company puts it.

Intel points to the prediction that there will be an additional 1 billion connected computing users by 2015 and that with more types of devices there is great value in providing a common experience between the devices.

Of course it's no major revelation to hear Intel say this as this is the common mantra shared by many industry vendors, but given the company's array of 'touch-points' to so many devices - it would be worrying if we weren't hearing it. For Intel, a seamless cross-device experience means desktop computers, mobile devices and also data connectivity inside cars and throughout the home. The message is "consistency and accessibility" to information.


So moving forward to the company's USA Developer Forum, which is being held this week in San Francisco, the news is focused on the company's 2011 2nd Generation Intel Core processor family and the new opportunities that are being created for developers through better chip performance, battery life and a number of visually related features built into the chips themselves.

Sandy Bridge

Codenamed "Sandy Bridge," the company says that its latest chips will be based on its first new "visibly smart" micro-architecture produced at 32-nanometre. For the record, one nanometre is a billionth of a metre - and visibly smart is Intel's marketing term for on-processor graphics capabilities.

"The way people and businesses are using computers is evolving at an explosive rate, fueling demand for an even more powerful and visually appealing experience," said Dadi Perlmutter, executive vice president and general manager of the Intel Architecture Group. "Our upcoming 2nd Generation Intel Core processor family represents the biggest advance in computing performance and capabilities over any previous generation. In addition to offering these features inside Intel-based laptops, we plan to scale these advances across our server data center and embedded computing product portfolio."

Intel says that its new processor family will include a new "ring" architecture that allows the built-in processor graphics engine to share resources such as cache, or a memory reservoir, with the processor's core to increase a device's computing and graphics performance while maintaining energy efficiency.

Laptops and PCs powered by the 2nd Generation Intel Core processor family are expected to be available early next year.

Being Geek: you've seen the movie, now read the book

bridgwatera | No Comments

Twenty years ago if you were labeled a geek it was somewhat, well - geeky. Thus the term was coined in its initially negative context. Subsequently it became "chic to be geek" and the American electronics store Best Buy now uses its Geek Squad brand to sell customer support in a warm and fuzzy context.

The 1976 edition of the American Heritage Dictionary is said to have included the following definition of the word geek:

This word comes from English dialect geek, geck: fool, freak; from Low German geck, from Middle Low German. The root geck still survives in Dutch and Afrikaans gek: crazy, as well as some German dialects, and in the Alsatian word Gickeleshut: geek's hat, used in carnivals.


So now that the term is firmly embedded into our consciousness, it is perhaps only fitting that a software engineer has written a book to detail the highs and lows of a life in Silicon Valley working for some of the industry's biggest players including Apple, Netscape and Symantec.

"As a software engineer, you recognise at some point that there's much more to your career than dealing with code. Is it time to become a manager? Tell your boss he's a jerk or join that startup?" asks author Michael Lopp in Being Geek by O'Reilly Media.

"I wrote this book using more than 40 standalone stories because I want geeks and nerds alike to own their careers. It's too easy in an industry full of urgent things to do to forget that you're going to have many jobs and you get to choose where those jobs will take you," explains Lopp.

This book has some compelling-sounding chapters including: "The Business" (to help decide what you're worth), "How Not to Throw Up" (advice on giving presentations), "Managing Werewolves" (how to handle liars and people with devious agendas) and "The Itch" (how to realise when you should be looking for a new gig).

Texan "cloud calculator" aims to tot-up the cost of IT transformation

bridgwatera | 2 Comments

Texan cloud computing integrator Astadia is aiming to provide something rather more than just a gimmick with its "Cloud IT Transformation (ITX) ROI Calculator", which is now available online. Results for this free service are intended to help business leaders and IT teams estimate the financial returns of their cloud computing projects.

The calculator itself offers the chance to input data covering the network, storage, operating systems and database being potentially driven forward to the cloud under the service's IT Infrastructure option. Further fields exist including IT Services, Software Licences, IT Development, IT Project, Time and Results.

Final calculations are expressed as the total number of hours saved, the monetary value of the transformation cost and the total ROI.
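The arithmetic behind such calculators is, in outline, straightforward. A toy sketch - this is not Astadia's actual model, and the figures are invented:

```javascript
// Toy cloud-migration ROI: projected savings over a horizon versus the
// one-off transformation cost. All inputs are illustrative.
function cloudRoi(annualSavings, years, transformationCost) {
    var totalSavings = annualSavings * years;
    // ROI as a fraction of the cost, e.g. 0.5 means a 50% return
    return (totalSavings - transformationCost) / transformationCost;
}

// £100,000 to migrate, saving £50,000 a year, judged over three years
var r = cloudRoi(50000, 3, 100000);  // 0.5, i.e. a 50% return
```

The value of a real calculator lies in estimating those savings inputs credibly per platform, not in the division itself.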


With results available in British pounds sterling and in US dollars, the company hopes that the service will be accurate enough to give enterprise IT teams a helping hand as they navigate cloud implementations on every major platform, including Amazon Web Services, Google and the Salesforce.com platform.

The company says that many IT leaders struggle when trying to assess the cloud platform that is right for their needs. Others lack time or resources to build the business case to move specific technologies to the chosen platform. Astadia hopes its ITX ROI Calculator will give a fast estimate of which pieces of an IT infrastructure and application portfolio will cost less and perform better when moved to the Cloud.

"This new tool shows how much money I.T. departments could be leaving on the table," says Cory Vander Jagt, director, Astadia ITX. "Premise based IT solutions leak money at three different inflection points whenever a company gets bigger or smaller. Standardising an application environment in the cloud consistently provides elasticity at all three points."

Google launches Instant, but still doesn't share the 'secret sauce'

bridgwatera | 3 Comments

Google's leapfrog from search term Suggest (now renamed Autocomplete) to the new "Google Instant" predictive results service is impressive, but it would be so much more compelling if the search giant explained a little more about how its internal software development teams bring about the changes that they do.

It took years for Google to even open up a view of its data centres to the outside world, so we don't expect the full recipe for Instant's 'secret sauce' -- but a little more background in terms of engineering would be welcome.

Presented yesterday by the company's VP of search product and user experience Marissa Mayer, Google Instant is a new infrastructure enhancement designed to make search easier by showing you results right on the homepage as you type.


Google says that it estimates that Instant will save typical searchers between two and five seconds on every query -- and that if every Google searcher used Instant, this would save more than 11 hours per second.

This is about as deep as this week's announcement gets, "On the back-end, Instant is pushing the limits of our technology and infrastructure. For typical searches, we estimate we'll show between five and seven times as many results pages as before. With Google already serving more than one billion searches each day, we needed to find a way to efficiently serve all that content."

When asked for more information on the programmer/developer angle of this announcement, Clara Armand-Delille, corporate communications and public affairs manager for Google UK said, "With Instant, we're introducing a couple of new features that include dynamically displayed results right on the homepage so you can quickly review and find your search result. Predictive text saves time by guessing your search so you don't have to finish typing. Lastly, scroll to search enables you to manually scroll through a list of Autocomplete predictions with the arrow keys and instantly see results for each."

Ah, so no secret sauce. So far, this is about all we know - the Google PageRank algorithm is as follows...

PR(A) = (1-d) + d(PR(t1)/C(t1) + ... + PR(tn)/C(tn))

PR(A) stands for the Google page rank of our arbitrary example page A

t1 - tn are the pages that link to page A

C is the number of outbound links that a page has and in this case our C variable is examining pages t1 to tn

d is a damping factor, usually set to 0.85 - a standard device used when working with iterative numerical algorithms
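The recursion above can be run to a fixed point. A toy sketch - the three-page link graph is invented for illustration:

```javascript
// Iterative PageRank: PR(A) = (1-d) + d * sum(PR(t)/C(t)) over pages t linking to A
function pageRank(links, d, iterations) {
    var pages = Object.keys(links);  // links[p] = array of pages that p links to
    var pr = {};
    pages.forEach(function (p) { pr[p] = 1; });  // initial guess
    for (var k = 0; k < iterations; k++) {
        var next = {};
        pages.forEach(function (p) { next[p] = 1 - d; });
        pages.forEach(function (p) {
            var c = links[p].length;  // C(p): outbound link count
            links[p].forEach(function (q) { next[q] += d * pr[p] / c; });
        });
        pr = next;
    }
    return pr;
}

// A links to B; B links to A and C; C links back to A
var pr = pageRank({ A: ['B'], B: ['A', 'C'], C: ['A'] }, 0.85, 50);
// A, which everyone links to, ends up with the highest rank
```

Google's real system computes this over billions of pages, but the fixed-point idea is the same.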

It is likely that Google has progressed the current form of this equation, but that it is still largely based on this initial form. As for secret sauce, no luck this time.

IBM "Summit at Start": sustainable future IT cities

bridgwatera | No Comments

IBM has already shown itself to be extremely fond of its "Better Software = Better Business" tagline, which it has used at its Rational software developer events in various forms for most of the past decade.

Taking the company's business-to-IT connection concept forward this week is an event in London called "IBM Summit at Start", which forms the business component of Start, an initiative established by HRH The Prince of Wales to support sustainable living.

IBM says that throughout the summit, more than 1000 senior executives and policymakers are confirmed to speak or engage in debate, including some of the UK's most prominent CEOs and public sector leaders. Themes to be discussed include: the future of cities, energy, transport, skills, future business leaders, smarter supply chains, financing sustainability and the information revolution.

Addressing the open day delegation, Stephen Leonard, chief executive IBM UK & Ireland said, "Business is at the heart of cities - from the way employees work to the buildings that we work in, to the energy we consume and the type of work we undertake. Progress in sustainability will only come where positive sustainable outcomes achieve tangible business benefits."

This is not an "IBM Conference" but a debate and discussion across the business community - Start is a national initiative by The Prince's Charities to promote and celebrate sustainable living. It aims to demonstrate what a more energy efficient, cleaner and healthier future could look like.

IBM's Summit at Start runs from Wednesday, September 8th to Thursday, September 16th, 2010.

You can watch The Smarter City - premiered at 'The IBM Summit at Start' below:

SCM systems - excuse me, but have you got a license for that thing?

bridgwatera | No Comments

Software Configuration Management (SCM) vendor Perforce has just added "subscription pricing" to its licensing options. The concept here being that customers get more choice if they can buy a simple one-year subscription for as little as $360 per user.

The company says that subscription licensees enjoy the full power of Perforce, including all integrations and plug-ins, software upgrades and technical support. Upon renewal, a portion of the subscription fees may be applied to the eventual purchase of perpetual licenses.

Perforce's SCM technology proposition hinges on the fact that software application development projects are inherently complex -- and that spiralling binaries, project skew and staff changes all necessitate a core central code repository (with layers of management control) to holistically coalesce and control the complete job in hand.

"Some companies want a powerful and scalable SCM system but are uncertain how many licenses they may need long term," said Dave Robertson, VP International for Perforce Software. "Rather than pay full price up-front for a perpetual license, the subscription option makes it more cost effective for new and existing customers to add or remove licenses on an annual basis."

"Pseudo-language" for Windows network scripting

bridgwatera | No Comments

Binary Research International is talking up FastTrack Scripting Host version 6.0, the latest release of its "pseudo-language" designed to handle the scripting needs of system and network administrators in a Windows network.


The company says that this new version features a full native interface for any operation performed on Active Directory users, groups, computers and organisational units.

"Active Directory is notoriously difficult to script; even the 'experts' have difficulty producing an error-free script," said Lars Pedersen, lead developer of FastTrack Scripting Host. "With existing scripting tools maybe three out of 100 administrators can write advanced scripts related to Active Directory. With FastTrack, 100 of 100 can do it."

In terms of where the company gets the term "pseudo-language", it justifies this by saying that while Microsoft moves scripting more towards actual programming, FastTrack Scripting Host goes in the complete opposite direction.

"A good systems administrator is typically one that knows about infrastructure, not programming. While FastTrack is a scripting language, it's so high-level and easy to use that it resembles configuration more than actual programming," added Pedersen.

A free trial is available for download.

Integrating Adobe Flash with .NET in Microsoft Visual Studio

bridgwatera | No Comments

In a partnership move featuring two of the more imaginatively named technology companies around, Midnight Coders and SapphireSteel Software are snuggling up together to integrate the Adobe Flash Platform with Microsoft .NET.


When it comes to building Rich Internet Applications (RIAs) for the .NET environment, software developers have generally preferred to use the Visual Studio Integrated Development Environment (IDE).

It is generally argued that Visual Studio is easier for pure Microsoft environments (such as for Silverlight client to .NET services), but of course Silverlight doesn't enjoy the wide installed base that Flash has (99% of all Internet-enabled desktops - if you believe Adobe's figures).

With the release of SapphireSteel's Amethyst, the company says that developers now have an easy way to create Flash-based applications right inside Visual Studio. SapphireSteel also says that the problem of linking Flash with .NET is solved by Midnight Coders' WebORB which provides end-to-end client-server application development across the two platforms.

SapphireSteel's official launch of Amethyst is this week, and the integration of WebORB with Amethyst is well under way, with a launch due by the end of the year.


About this Archive

This page is an archive of entries from September 2010 listed from newest to oldest.

August 2010 is the previous archive.

October 2010 is the next archive.
