September 2013 Archives

Intel's gamesmaster on breaking a mutant into triangles

bridgwatera

To glean some insight into how video games development is shaping up today, the Computer Weekly Developer Network caught up with Richard Huddy, European gaming enablement manager at Intel.
Huddy explains that he teaches games developers how to write games -- but what he really does is explain how the technology inside his firm's Graphics Processing Units (GPUs) and Central Processing Units (CPUs) works. In this way, games programmers can be encouraged to look at processor optimisation and to identify the areas where applications can benefit from parallelism and concurrent threading.

The parallelism here is on the graphics side as lots of pixels are "in flight" at any one time through the GPU.

"Games development is different in this way... i.e. getting the best out of a GPU with multiple cores is quite different to getting the best out of a CPU," explains Huddy.

Manufacturing mutants
In terms of how an image is manipulated by a game engine, a mutant killer character (for example) is broken up into lots of triangles. There are lots of pixels associated with each triangle, lots of triangles associated with each character, and lots of parts in the total model (each with different textures, for example). This is where parallelism can be brought to bear, as different processing tasks can focus on creating different parts and aspects of the image.

Image credit: http://fallout.wikia.com/

NOTE: Rasterisation is the process of taking an image that has been described in a vector graphics format and converting it into a raster image (in the form of pixels or dots) for output to video.
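To see why this work is so parallel-friendly, here is a minimal half-space triangle rasteriser in Python -- a sketch of the general technique, not Intel's actual pipeline. Every pixel test is independent of every other, which is exactly what lets a GPU shade thousands of pixels concurrently.

# Minimal half-space triangle rasteriser -- an illustrative sketch,
# not Intel's pipeline.

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: which side of edge (a -> b) does point p lie on?
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterise(triangle, width, height):
    (x0, y0), (x1, y1), (x2, y2) = triangle
    covered = []
    for y in range(height):
        for x in range(width):
            # A pixel is inside if it sits on the same side of all three edges.
            w0 = edge(x1, y1, x2, y2, x, y)
            w1 = edge(x2, y2, x0, y0, x, y)
            w2 = edge(x0, y0, x1, y1, x, y)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.append((x, y))  # each pixel could be shaded in parallel
    return covered

# One triangle of the mutant's mesh; a real character model has thousands.
print(len(rasterise(((1, 1), (18, 3), (9, 15)), 20, 20)))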

Huddy notes that "ray tracing" is the other rendering technology here, although it is used more in movies than in PC games -- as is "micro polygon rendering", if you want to get technical.

"Intel does a great Ray Tracer product called EMBREE and this is out in the public domain -- open sourced. This is great way of using CPU cores for rendering images really quickly," said Huddy.

Tomb Raider tricks

If you have noticed the amazing camera spin when playing Tomb Raider 2013, you'll have seen the panning camera sweep around Lara Croft begin to blur the background as you keep spinning -- almost like a real person's view of a dizzying spin.

"Tomb Raider's camera spin is fundamental to creating a more realistic camera experience -- video games used to show images in perfect focus everywhere, but movies have used techniques to blur out... now video games are building in this same kind of 'temporal coherence' too," said Huddy.

NOTE: Pigeons see at 1000 Hz, humans at around 60 Hz -- anything faster than that, we start to blur out.
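One common way to fake that dizzying blur -- a sketch of the general accumulation-buffer technique, not necessarily the Tomb Raider renderer's own implementation -- is to blend each freshly rendered frame with a fraction of the previous blended frame:

# Accumulation-buffer motion blur -- a sketch of the general idea.
# Fast-changing pixels smear into a blur; static pixels stay sharp.

def blur_frame(new_frame, previous_blend, persistence=0.6):
    # persistence: how much of the old image survives (0 = no blur at all)
    return [persistence * old + (1.0 - persistence) * new
            for old, new in zip(previous_blend, new_frame)]

# Spin the camera: the scene shifts every frame, so the blend smears.
blended = [0.0, 0.0, 0.0, 0.0]
for frame in ([0, 0, 255, 0], [0, 255, 0, 0], [255, 0, 0, 0]):
    blended = blur_frame(frame, blended)
print([round(p) for p in blended])  # the bright pixel leaves a fading trail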

Perceptual computing that takes account of user eye movements will start to be important in games as we go forward.

"Specialist skills in games development are different to those seen within business applications development -- the same bad habits might exist, pizza, coffee... one skill here would be a "willingness to solve impossible problems" if you look at what GPUs have been doing in the last few years (look at Battlefield) ... developers have tackled huge scale problems and tiny nuances in rendering to to produce a fake for artificial intelligence," said Intel's Huddy.

Ada language comes of age

bridgwatera

Commercial Ada software solutions company AdaCore has launched a free training programme for developers and hobbyists interested in understanding and learning Ada.

AdaCore describes Ada as a "state-of-the-art" programming language designed for large, long-lived applications in environments where safety, security and reliability are critical.
NOTE: Ada is named for Augusta Ada Byron, Countess of Lovelace (1815-1852), who helped Charles Babbage conceive how programs might run in his mechanical Analytical Engine.

The use of Ada is spreading

AdaCore says that today, in 2013, as these considerations for safety and security affect more and more projects, the use of Ada is spreading across the world and into multiple sectors.

To meet this need, AdaCore University was created this September.

It is a free online resource centre that offers pre-recorded courses and other learning materials on Ada, with access to AdaCore's open source GNAT Ada toolset for writing and running example programs. Students at all levels of experience and expertise can begin writing programs quickly and can proceed at their own pace.
AdaCore University courses educate through examples, allowing students to see, understand and experiment with most features of the Ada programming language. Drawing on the experience and teaching credentials of Ada experts such as New York University Emeritus Professors and AdaCore founders Robert Dewar and Edmond Schonberg, the courses explain Ada's technical concepts with insight into the rationale and usage of particular features.

The initial curriculum includes two courses:

Ada 001, "Overview" - a module that presents an overall picture of the language and that allows students to write small programs; and ...

Ada 002, "Basic Concepts" - the first in a formal series of Ada classes, introducing basic Ada programming concepts and allowing students to write programs based on these features.

Both of these modules, and all future courses, provide sources and installation instructions for all learning materials and tools. The courses cover the latest version of the Ada language (Ada 2012), and students have access to AdaCore's GNAT Ada development environment and programming tools.

AdaCore University is an ongoing, live project that will be expanded to include more advanced courses on Ada. It is endorsed by leading non-profit organisations dedicated to sustaining and promoting the Ada programming language, including the Ada Resource Association, SIGAda, Ada Deutschland, Ada France and Ada-Europe.

What is a digital asset?

bridgwatera

As software application development now spans an infinite variety of user touchpoints across a multiplicity of devices, is it fair to assume that our definition of digital assets should also expand?

For a base definition of digital assets, we can say the following:

DIGITAL ASSETS = DATA (in files) + CONTEXT (in the form of metadata)
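A minimal sketch of that equation as a data structure (the field names here are invented for illustration):

# DIGITAL ASSET = DATA + CONTEXT, sketched as a structure.
# Field names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class DigitalAsset:
    data: bytes                                   # the file itself
    metadata: dict = field(default_factory=dict)  # the context

photo = DigitalAsset(
    data=b"...jpeg bytes...",
    metadata={"owner": "a.bridgwater", "taken": "2013-09-14",
              "location": "San Francisco", "licence": "all rights reserved"},
)

Strip the metadata away and all that is left is an anonymous blob of bytes; the context is what turns data into an asset we can own, value and manage.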

So as digital assets now start to represent every tangible (and, more often, intangible) "thing" that we can assert ownership over, we reach the point where we also want to control and add value to those assets...

... once again, our view of the total digital assets universe is expanding.

There are more "types" of digital assets today, and so we have to broaden our definition of the known universe.

At the firm's Intel Developer Forum 2013, Intel futurologists and computing evangelists presented a slide listing 30 new types of digital asset.


If we accept this broader universe, then there are direct implications for Digital Asset Management (DAM) and the rest of the computing infrastructure upon which we now base our home and work lives.

If you thought digital assets stopped at files and folders, perhaps it is time to reassess.

Box: business tools are broken, content needs resuscitation

bridgwatera

This week's BoxWorks 13 has seen the firm put out an increasingly confident set of messages to explain where its secure content sharing platform is going.

But first a question: what is a content sharing platform if it's not just a cloud storage service?

The firm sent out details for this week's activities to certain attendees' own Box accounts (you can get 10 Gig free or sign up for 50 Gig) as a Word document.

But this is a Word doc with the option to comment and list messages upon it. This is a Word doc that shows the members of a designated collaborative work group that any user can connect to. This is a Word doc that will send alerts to logged-in users as and when any updates are posted to it. This is a Word doc that will ultimately enjoy metadata enrichment, so that we know information about the information within.

This is not just a Word doc anymore then.

But how will this benefit business?

Box CEO and co-founder Aaron Levie sees a channel for end users to adopt this kind of technology virally (that's why the basic account is free) and this, if you like, is stage one in terms of putting new tools into users' hands.

Levie asserts that traditional business tools are essentially broken.

Legacy tools, he says, (including those from as recently as the last decade) do not fit the way we work today and have essentially overshot how users want to work -- but Levie sees a way to "retrofit" existing older technology towards new tools.

During a breakout session Levie told press, "The problem is that enterprises today have either a) high-level consumer tools that do not integrate well with the deeper level backend enterprise data backbone... or b) deep enterprise tools that are hard to use. What we want to do is bridge and balance between these two worlds. The Holy Grail is making this happen and providing a route for the right metadata also to be incorporated."

The company's freshly launched Box Notes tool aims to go some way towards this new world of content usage -- it is designed to provide a lightweight editing interface to create content and facilitate collaboration directly on Box.

NOTE: Box is currently accepting sign-ups for a limited private beta.

"From day one, our mission has been to make it incredibly easy to store and share information," said Levie.


"As business becomes increasingly fast-paced, fluid and mobile, the very nature of information is changing, and the tools businesses use to capture that information need to evolve. To unlock the collective knowledge of organisations, Box needs to not only be a place where people can store and share content, but also a place where they can create and capture ideas in real-time across teams."

The firm is NOT setting out to replace Microsoft Office or create the next game-changing word processor -- it is instead trying to convince us that real-time concurrent editing, in-line toolbars and annotations should be part of the way we use information tools.

For example, the Collaborator Presence/Note Heads function allows users to see who is collaborating on a Box Note in real time, with a user profile picture that follows your cursor on the left-hand side of the screen, letting your colleagues easily see where you're working in the Box Note.


There is security here in the form of "sophisticated" data encryption and advanced account settings.

There are also rich media embeds, i.e. the ability to drop in rich media such as videos, images and audio, as well as offline editing, whereby changes made to a Box Note offline are automatically synchronised the next time a user connects.
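That offline path is essentially an edit queue. A minimal sketch of the general sync pattern (not Box's actual client code) might run as follows:

# Offline-edit queue -- a sketch of the general sync pattern,
# not Box's client implementation.

pending_edits = []  # edits made while disconnected

def push_to_server(note, change):
    print(f"synced {change!r} to {note}")  # stand-in for the real upload

def edit_note(note, change, online):
    if online:
        push_to_server(note, change)          # normal, connected path
    else:
        pending_edits.append((note, change))  # queue for later

def on_reconnect():
    while pending_edits:                      # replay queued edits in order
        push_to_server(*pending_edits.pop(0))

edit_note("roadmap", "add Q4 milestones", online=False)
on_reconnect()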

"Broken business tools" may be too strong a claim and another case of Levie playing to the crowd -- he's good at that.

To state that business tools need cloud-empowerment is more of a certainty, and Box is challenging Google Docs, Dropbox and Microsoft Office 365 in this space in maverick fashion. The proposition of coding to these "new" platforms is our next point of focus and you will hear more of this from here on in.

What to expect from BoxWorks 2013

bridgwatera

Box CEO Aaron Levie has already said that he feels the IT industry needs a "more innovative and more prosperous" version of Microsoft -- although he is admittedly not looking to become the next Steve Ballmer.
NOTE: He has too much hair and is straighter talking, plus he eats less... we'll stop there, the list goes on.

What else Levie says this week will be interesting to hear.

This September sees his company's second BoxWorks conference staged in San Francisco and the firm is (arguably) already acting with the swagger and gravitas of a much larger corporate -- although in an (arguably) more understated fashion.

Box is of course a cloud storage company, but it wants to be known as a "secure content sharing platform" --

-- hence the developer angle is close at hand.

Box content can in fact be shared internally (i.e. inside the application by logged in users themselves) but also externally (i.e. accessed through iPad, iPhone, Android and Windows Phone applications) plus it can also be extended to partner applications such as Google Apps, NetSuite and Salesforce --

-- hence the developer angle is close at hand.

Box Dev Day

The firm is pushing its developer credentials forward strongly enough to carve out a complete DevDay section, where tracks can be found with labels like "Making things people want: design essentials", among others.

The "Up and to the right: growth and scale" session is offered on the basis of the following direct call to programmers: "Once you've identified your target customer profile, have a clear sense of the value your product offers, and have a firm grasp of the economics, it's time to scale."

Partners here include Parse (as in the Facebook backend server connections company), Okta (the secure cloud enterprise-wide identity management people) and Firebase (as in the scalable, real-time backend web application specialists) -- these firms are all about helping programmers to build collaborative applications without the hassle of managing servers or writing server-side code --

-- hence the developer angle is close at hand.

In his keynote address for BoxWorks 2012, Levie said there were three factors heightening demand for content and collaboration in the cloud:

  • unprecedented growth in post-PC devices (more than 1.3 billion smartphones and tablets are predicted to ship in 2016, according to IDC);
  • a projected 100-percent increase in the number of mobile workers worldwide by 2015 (IDC);
  • and an explosion in the amount of data that organisations are creating and need to manage.

-- hence the CLOUD developer angle is close at hand.

Intel #IDF13: Should computing change the world, or make electronic coffee cups?

bridgwatera

There's some thought-provoking discussion going on at Intel Developer Forum this week.

Intel is rolling out a bunch of new processors and developer suites and you can read about those items in the product blogs...

... but it is the futurists and the microprocessor evangelists who have some of the most interesting material up their sleeves.

NOTE: Never be afraid to read content proposed by a so-called "tech evangelist". While 5% of disreputable tech companies use this role as a glorified marketing officer, 95% (OK, maybe 90%) of firms, Intel included, give the position to truly worthy people who have technical backgrounds but also keep a strategic eye on social science, technical usability and so on.

As previously noted on Computer Weekly, Intel's co-chief evangelist is Steve Brown -- he's British, but he's lived in America long enough to start saying "awesome!"... let's forgive him for now, please stay with me.

From 1500 nanometre to 22

Painting a picture of where technological advancement is now, Brown reminded us that the big old chunky Motorola carphones that we started out with in the 1980s were based upon 1500 nanometre technology.

By comparison, today's smartphones feature 22 nanometre technology.

NOTE: It is important to remember that Intel is working at the nanometre level of technology engineering, where we should think about one nanometre in these terms -- if the diameter of a marble were one nanometre, then the diameter of the Earth would be roughly one metre.
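The analogy survives a quick arithmetic check (using round figures for the marble and the Earth):

# Sanity-check of the marble analogy, with round figures.
marble_m = 0.013   # a marble is roughly 13 mm across
earth_m = 1.27e7   # the Earth is roughly 12,700 km across
nanometre = 1e-9

scale = nanometre / marble_m   # shrink the marble down to one nanometre
print(earth_m * scale)         # ~0.98 -- the Earth shrinks to about a metre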

We might also like to remind ourselves that smartphones today have more performance than a Pentium 4 did when it arrived back in 2000.

So now we come to the point of our story... i.e. should we use computing power to change the world, or make electronic coffee cups?

In a breakout "cookies and coffee" session with Intel's Steve Brown we discussed the following issue: when we have enough computing power to do anything we want, what we need to decide is exactly what we DO with it.

For example, if we have enough intelligence to automate a coffee cup to tell us whether it is too hot to hold, should we REALLY be doing that when there are starving children in Africa whose countries' food supply infrastructures could be automated instead?

Never knowingly underevangelised, Intel's Brown immediately reminded us that he has already provided us with a TED talk on exactly this type of subject which you can view below.

VIDEO -- Why machines must make us better humans - Steve Brown at TED@Intel:

Intel Developer Forum: the context-aware app cometh

bridgwatera

Steve Brown, Intel chief evangelist and futurist, kicked off this San Francisco-located event this year with what is now (for all vendors) known as the "day zero" pre-show briefing.

The company is using this year's gathering of programmers and other IT pros to explain where its vision of future devices and applications usage is headed from a hardware (device), component (processor and other central architectural elements) and software (applications) perspective.

So what of the future?

Brown explained that when we look at what computing can do for us next, we must look to how contextually aware, personally tailored applications will be developed.

A human genome takes up about a petabyte of data... if we can have all of this information at our fingertips, then there is a huge opportunity to change the future of medicine and start to tailor the medicine prescribed to each individual person.


Lama Nachman, principal engineer at Intel Labs, spoke on the subject of context awareness.

Nachman hates "micro-management" and the fact that she has to keep reconfiguring her favourite apps whenever she travels.

Devices and, perhaps more crucially, applications need to become more context aware, she said.

"Communications and alerts need to become more than some unified equal layer of data... I want to know the difference between information contained in a call or alert that is simply a friend wanting to chat and one that is going to tell me my house is burning down. Devices should know whether they are being used in noisy environments ... maybe due to sound sensors... maybe due to geo-location sensors, or both."

Ambient context awareness

Continuing the Intel futurist train of thought, next up was Ravi Iyer, director and senior principal engineer at Intel Labs.

In Iyer's view, contextual applications are housed on devices that are either "on you, or around you" and that understand context.

Today you might have a phone, set of keys with electronic key fob, maybe even a headset/microphone. Even as we stand today we find some of these items bulky and we would like to reduce the size (and usually also increase the power) of these things in the future.

INTEL'S GOSPEL: Ultra-low power and ultra-low cost should be a given for future devices and applications and these units should employ ambient environment context awareness intelligence.

This soon-to-be-experienced future could feature new input mechanisms -- not just through UIs or via speech, but perhaps by simply moving the device, or by some as yet undeveloped (but still perhaps quite obvious) means such as shaking detected via an accelerometer.

The device will need to work on ultra-low power... operating at microwatts or milliwatts as opposed to watts (so that battery life lasts for perhaps a week) if it is, for example, a wearable health-type device that just needs to come to life for a second, take your blood pressure or heart rate, and then go back to sleep.
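In sketch form, that wake-measure-sleep pattern (with an invented sensor API standing in for real hardware) looks something like this:

# Duty-cycled wearable loop -- a sketch of the pattern, with an
# invented sensor API. The device is awake for around a second per
# sample and asleep the rest of the time, so the average power draw
# sits at milliwatts rather than watts.

import time

def read_heart_rate():
    return 72  # stand-in for a real optical heart-rate sensor

def wearable_loop(samples=3, sleep_interval_s=2):
    for _ in range(samples):
        bpm = read_heart_rate()       # the brief burst of active work
        if bpm > 120:
            print("alert: elevated heart rate", bpm)
        time.sleep(sleep_interval_s)  # deep sleep between samples

wearable_loop()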

Energy harvesting

Technologies including NFC and also "energy harvesting" come into play here.

Brian David Johnson, futurist and evangelist, talked on the subject of 2020, when the computer approaches zero in terms of size, but also zero in terms of cost and zero in terms of power consumption -- such that these devices can start to pick up "ambient energy": everything from solar, to kinetic, to radio wave energy that could be tapped into.


A brief history of devices according to Intel

Mainframe
Mini
Workstation
PC
Laptop
Mobile
Ubiquitous computing devices (in the Internet of Things and wearable technology etc.)

Lessons from Copernicus

Dr Tony Salvador rounded out this discussion in his position as director and senior principal engineer for Intel Labs.

Salvador's story hinged on the fact that we used to believe that the view of the sky above us was all that we could see. But when Copernicus studied the sky (his heliocentric treatise was published in 1543), he started thinking about the construction of a far bigger system... the first images of the heavens were based upon belief (belief-based data, if you like), but Copernicus moved us on to data-driven belief, such that a full heliocentric view of the solar system was constructed...

... and this is an overriding theme for the way we will move to use data in the future, i.e. context-based data related to what is actually around us (and the process of how we work with it) will shape our world in the next decade.

What to expect from Intel Developer Forum 2013

bridgwatera

Intel kicks off its eponymously named developer forum (IDF to the cognoscenti, if you don't mind) next week in San Francisco's Moscone center/centre for what is its 13th (lucky for some) year.

This year we can expect to hear the firm talk more about enterprise cloud adoption in an open data centre world -- Intel may not be the first source of cloud spokespersons for some, so it will be interesting to hear its position.

Build, with chips?

As a "developer" symposium one might expect this to be a case of Microsoft's BUILD, but with chips... (with chips and salsa even?) and that should indeed be the case.

Indeed, Intel will no doubt be focusing on its latest line of processors and coprocessors, including Haswell-EP, Broadwell and the range of Intel Xeon Phi coprocessors.

In fact, some of the big news is already out and this week we have seen the release of updates to two flagship development suites: Intel Cluster Studio XE 2013 and Intel Parallel Studio XE 2013.

Processors & coprocessors

According to Intel, these "integrated" suites are designed to help C++ and Fortran developers create applications for that same latest line of processors and coprocessors.

Partners will be busy too... also this week (before the show) we have seen Haskell software tools and services developer FP Complete launch the first commercial Haskell IDE and deployment platform.

NOTE: Open source Haskell is a purely functional programming language for rapid development with strong integration with other languages, built-in concurrency and parallelism, debuggers and libraries.

Eliminating spaghetti code

Intel technology partner FP Complete blames bug fixing, rewriting and maintenance cycles for delays on many software development projects today -- the firm says that the "root cause" of these problems is the inefficient and error-prone nature of the imperative programming languages (Java, the C family, Python, Ruby etc.) that are dominant today.

Also in the pipeline is a web TV product from Intel... although details are sketchy at this stage.

From this point we could be getting into the really interesting stuff, where we get to look at wearable technologies, facial and gesture recognition, and where the next shape of the UltraBook could potentially go -- and there's got to be news on the Thunderbolt port too.


This is a mere stream of consciousness without hearing any of the real meat; the show itself starts next week.

What is bare-metal cloud?

bridgwatera

The cloud computing model of service-based IT delivery has been over-hyped, over-sold and (in many respects) over-talked. This space's next big thing is the high-performance bare-metal cloud solution.

But what is bare-metal cloud? Who uses it? What does it do? And how should developers code for this environment?

Guest blogging with some much needed answers on the Computer Weekly Developer Network is Gopala Tumuluri, VP of hosted services at IT infrastructure solutions company Internap.
Organisations' collective demands for flexibility, scalability and efficiency have sent them flocking to public cloud infrastructure services, which represent an opportunity to cut IT costs while capitalising on technology innovations.

But, just a few short years into the cloud revolution, new options have appeared in response to the stress that high-performance workloads can put on traditional public clouds.

Degradation situations

Performance degradation can often occur, stemming from the introduction of a hypervisor layer. While the hypervisor enables the visibility, flexibility and management capabilities required to run multiple virtual machines on a single box, it also creates additional processing overhead.

For application architectures that demand high levels of data throughput, the 'noisy neighbour' side-effect of the multi-tenant design of virtualised cloud environments can be constraining.

Multi-tenant virtualised public cloud platforms leave virtual machines competing for and restricting I/O for data-intensive workloads, leading to inefficient and inconsistent performance.

How is bare-metal cloud different?

The bare-metal cloud provides a way to complement or substitute virtualised cloud services with a dedicated server environment that eliminates the overhead of virtualisation without sacrificing flexibility, scalability and efficiency.

Bare-metal cloud servers do not run a hypervisor, are not virtualised -- but can still be delivered via a cloud-like service model.

This balances the scalability and automation of the virtualised cloud with the performance and speed of a dedicated server. The hardware is fully dedicated, including any additional storage. Bare-metal cloud instances can be provisioned and decommissioned via a web-based portal or API, providing access to high-performance dedicated servers on demand.

Also, depending on the application and use case, a single bare-metal cloud server can often support larger workloads than multiple, similarly sized VMs.
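Provisioning through such an API might look roughly like this -- the endpoint, fields and token below are hypothetical, not Internap's actual interface:

# Hypothetical bare-metal provisioning call -- the endpoint, fields
# and token are invented for illustration; a real provider's API
# documentation is the authority here.

import json
import urllib.request

def provision_server(api_base, token, config):
    request = urllib.request.Request(
        api_base + "/servers",
        data=json.dumps(config).encode(),
        headers={"Authorization": "Bearer " + token,
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. {"id": "...", "status": "provisioning"}

server = provision_server(
    "https://api.example-host.com/v1", "MY_API_TOKEN",
    {"cpu_cores": 16, "ram_gb": 128, "os": "centos-6", "storage_gb": 2000},
)

The point developers should note is in that last call: it is the same self-service request-and-release model as a virtualised cloud, but what comes back is a whole physical machine.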

Which workloads see the most benefits?

High-performance, bare-metal cloud functionality is ideal for instances where there is a need to perform short-term, data-intensive functions without any kind of latency or overhead delays, such as big data applications, media encoding or render farms.

In the past, organisations couldn't put these workloads into the cloud without accepting lower performance levels. Organisations having to adhere to rigorous compliance guidelines are also good candidates for bare-metal cloud.

How should developers code for bare-metal cloud?

An intrinsic benefit of the bare-metal cloud environment for developers is that no special considerations need to be made when coding for these servers; this is in contrast to traditional virtualised environments, where developers need to account for a potentially untold number of 'noisy neighbours' on the same box when coding.

Is DevOps getting sexier?

bridgwatera

If you believe the hype cycles, DevOps is getting sexier.

Well, to be more accurate (and a bit less headline-grabbingly dramatic), DevOps is becoming a more discernible real-world technology proposition and can now be perceived as a tangible discipline inside datacentres and development teams.

If DevOps started out sounding like it was born out of marketing spin emanating from the application release and deployment management vendors, it's now a bit clearer how we might employ the term.

As the intersection point between Dev-elopers and Op-erations (database administrators, sysadmins, network admins etc.) -- perhaps we can make DevOps sexier if we remind ourselves that application problems can happen anywhere?

Software application development problems can occur on the end user device, on the network, inside infrastructures or in the application code.


IT operations staff and developers often know there's a problem but can't get to the right level of analysis quickly or easily enough to minimise and mitigate the impact on the end user experience.

Frequently, expert staff -- be they Level 3 network experts or application developers themselves -- must be taken off key projects in order to troubleshoot issues.

Enter sexy-time DevOps.

New improved Mr (or Ms, or Mrs, or Miss) DevOps is now capable of wielding a single appliance with integrated "application-aware" network performance management (aaNPM).

Suddenly DevOps isn't just "yeah, we'll try and look after your code build release and give it a bump start if it needs it"...

... DevOps is (all of that) plus a more top-of-the-line solution where end-to-end performance management extends from deep-dive packet and network analysis through to application transactions and the end user experience, or EUE if you will.

So aaNPM for better EUE then?

This DevOps practice is all about bringing network intelligence to application performance -- and Riverbed Technology's Shark module for AppResponse Xpert is one such beast.

According to a new Forrester Consulting study commissioned in August 2013 by Riverbed, "Forrester's Ideal Tool Set for Application Performance Management for Better Business Performance", 52 percent of the IT operations teams surveyed waste more than 20 percent of their operational resources tracking and correcting problems.

On top of the business productivity loss and the damage to the brand, IT productivity also suffers from this state of affairs, as resources are called from their normal work to perform unplanned and unscheduled tasks.

So the worlds of APM and NPM (Application and Network Performance Management) must now come together if IT organisations are going to be successful in making the transition to application-centric operations.

Could an APM/NPM convergence make DevOps sexier?

Let me pour you a glass of wine and undress this multi-terabyte application network packet recording while you decide.

CV keyword bingo: HTML5, Hadoop & MongoDB

bridgwatera

Job search website indeed.com has compiled a job trends report, with a supporting graphical analysis, to detail the top keywords that job hunters submit and employers look for when evaluating curricula vitae (CVs) -- or résumés, if you prefer the Franco-Americanism.

The key light-switch-on word or term applicants can use is (perhaps unsurprisingly) HTML5.

"HTML5" is the #1 job trend - the fastest growing keyword found in online job postings - ahead of "MongoDB" in second place and "iOS" in third place says the company.

The full list of top ten keywords (shown below) suggests (again perhaps unsurprisingly) that web, mobile, big data and cloud all lead in terms of what job hunters are putting forward as their main skills today.

Social media makes a respectable entrance in tenth place also.

1. HTML5
2. MongoDB
3. iOS
4. Android
5. Mobile app
6. Puppet
7. Hadoop
8. jQuery
9. PaaS
10. Social Media


You can use indeed's search tool to look up any words or phrases by entering them in the search box at http://www.indeed.com/jobtrends.


About this Archive

This page is an archive of entries from September 2013 listed from newest to oldest.

August 2013 is the previous archive.

October 2013 is the next archive.
