Krome Studios upgrades to 10 gigabit switches

Games developer Krome Studios has installed new 10 gigabit switches to connect its two Brisbane offices, a project that included laying new fibre between the buildings.

Australia's largest games developer, Krome Studios, has upgraded its network to 10 Gigabit Ethernet switches with the help of Netgear, and now finds its biggest problem is back-end disk storage, which can't keep up with the speed of the network.

SearchNetworking ANZ's Ian Yates interviewed Krome's IT Manager, Jason Muir, to find out how he managed the network upgrade across four offices, what was involved in installing the dark fibre link between two adjacent offices in Brisbane, and how data transmission across the backbone has performed since the upgrade.

Ian Yates (IY): You put some fibre across the road - just how do you put fibre across the road?

Jason Muir (JM): Basically we had wireless to start off with, and we decided the wireless wasn't fast enough, so we needed to put fibre in. We contracted a company called Pipe Networks. They specialise in doing this sort of work; they tunnelled under the ground up a laneway, across a road, and terminated the fibre for us.

IY: OK. I know Telstra can do this, because Telstra apparently has a special clause in its legislation that says it can do what it likes when it comes to hole digging. But I wondered, didn't you have to go and get permission from somebody? Or did you just have to be careful not to drill a hole through the wrong things?

JM: They've got to get certain permissions and authority to do the work, because approval has to come from the council and from the relevant authorities like Telstra, who have to confirm that it's OK and that it's not going to impact any of their services. Once they say it's all OK, the guys can proceed.

IY: These tunnelling companies are obviously all set up to do that. You just hand it over and they make it happen yes?

JM: Pretty much.

IY: From the last time I was involved in a similar project, I got the impression that the guys doing the tunnelling were making more money than the guys selling the network kit.

JM: Yeah that's about right! It's not cheap to tunnel across a road. But the performance benefits were there and the business need was there for us to go to such a length to get that service in.

IY: I guess if you didn't have that high-speed link it would restrict the kinds of things you could do in the other building. You couldn't treat it like a normal part of the network.

JM: That's right. The way we were doing things, we had to have a remote server with local caches of files and things like that, just so that speed across the network was sufficient for that building. With this fibre now, we don't have any gear in that building except for Netgear networking kit: the ten gig switches and so on. All of the servers are housed back in the server room at our main head office, where we have a much better and more controlled environment to keep them in.

IY: Not many people have gone to ten gig yet. Did you really notice? Did the ten gig actually improve performance, or were your servers unable to take advantage of it?

JM: We have been running aggregated links in our servers for a while now.

IY: So you were getting more than one gig by running multiples?

JM: Yeah, well, we weren't really seeing it at the desktop end; it wasn't evident with the gigabit alone. But by aggregating the links, we found we could pull so much data across the network that things like RAID cards were becoming the bottleneck: the server couldn't deliver the data fast enough to us. We actually reached thresholds in the servers with the network. We found there was no benefit beyond a certain number of network points, so we're now looking at upgrading our servers.
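The shift Muir describes, from network bottleneck to storage bottleneck, is easy to see with some back-of-the-envelope arithmetic. The disk and link figures below are illustrative assumptions for a sketch, not Krome's measured numbers:

```python
# Rough bandwidth arithmetic: when does the network stop being the bottleneck?
# All throughput figures are illustrative assumptions, not measured values.

def gbit_to_mbyte_per_s(gbit):
    """Convert link speed in Gbit/s to raw MB/s (ignoring protocol overhead)."""
    return gbit * 1000 / 8

single_gige = gbit_to_mbyte_per_s(1)   # ~125 MB/s
four_way_lag = 4 * single_gige         # ~500 MB/s across an aggregated group
ten_gige = gbit_to_mbyte_per_s(10)     # ~1250 MB/s

raid_array = 400  # MB/s - assumed sustained throughput of a period RAID card

print(f"1 GbE link:   {single_gige:.0f} MB/s")
print(f"4x1 GbE LAG:  {four_way_lag:.0f} MB/s")
print(f"10 GbE link:  {ten_gige:.0f} MB/s")
print(f"RAID array:   {raid_array} MB/s")

# Once aggregate link speed exceeds the array's sustained rate, adding more
# network ports buys nothing - the storage back-end is the bottleneck.
bottleneck = "storage" if raid_array < ten_gige else "network"
print(f"Bottleneck at 10 GbE: {bottleneck}")
```

With assumed numbers like these, even a four-way gigabit aggregate already approaches what the disk subsystem can sustain, which is why Krome saw no benefit beyond a certain number of network points.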

IY: Are you doing something with one of the storage vendors? I think they have got some high speed links coming out of the back of their boxes haven't they?

JM: That's right. One of the main advantages of running our own private fibre is that it frees us up to do a lot of different things. If we want to look at iSCSI, or run Fibre Channel technologies across the dark fibre, we can. It gives us a lot more freedom than we had before.

IY: It would even give you the opportunity to do things like put disaster recovery centres across the road if you felt inclined.

JM: That's right. That's something that we have been looking at as well.

IY: Like splitting your servers across there, not for performance but for safety and back up and so forth.

JM: That's right.

IY: So you'd recommend 10Gig? It costs a bit but the benefits are worth the money?

JM: Yeah. We used to run multiple aggregated links between floors and things like that, just to get some bandwidth in the backbone. We've replaced all those with single ten gigabit uplinks, and when we start to hit the limits of those we'll look at aggregating the ten gig links. Bandwidth is now not a problem on our network. We are looking at other areas now.
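A single 10 GbE uplink is more than a convenience over a bundle of aggregated gigabit links: link aggregation (802.3ad/LACP) hashes each traffic flow onto one member link, so any single transfer tops out at one member's speed no matter how many links are in the group. The hash function and addresses below are simplified toy stand-ins for what a real switch does:

```python
# Why one 10 GbE uplink beats a 4x1 GbE aggregate for individual transfers:
# LACP-style aggregation hashes each flow onto ONE member link, so a single
# large file copy never exceeds one member's speed. The hash scheme and
# addresses here are simplified illustrations, not a real switch's algorithm.

import zlib

MEMBER_LINKS = 4       # members in the aggregate, 1 Gbit/s each
MEMBER_SPEED_GBIT = 1
TEN_GIGE_GBIT = 10

def pick_member(src: str, dst: str) -> int:
    """Toy flow hash: a real switch hashes frame/packet header fields."""
    return zlib.crc32(f"{src}-{dst}".encode()) % MEMBER_LINKS

# One workstation copying a big file to one server = one flow = one member.
print(f"Single flow over 4x1 GbE LAG: {MEMBER_SPEED_GBIT} Gbit/s max")
print(f"Single flow over 10 GbE:      {TEN_GIGE_GBIT} Gbit/s max")

# The aggregate only helps when many differently-hashed flows run at once.
flows = [("ws-01", "srv"), ("ws-02", "srv"), ("ws-03", "srv"), ("ws-04", "srv")]
links_used = {pick_member(s, d) for s, d in flows}
print(f"{len(flows)} flows spread across {len(links_used)} member link(s)")
```

This is why replacing the aggregated gigabit bundles with single ten gig uplinks, as Krome did, improves even single-stream transfers rather than just aggregate capacity.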

IY: It must be nice to have a network that you just basically can't kill.

JM: That's right. We did struggle at times in the past with bandwidth, and when you've got a constraint like that it makes things hard; you're limited in the ways you can slice and dice it up. Whereas now we know the network is not the issue at all, and we can concentrate on providing bigger and faster servers and delivering service. It gives us more parameters to play with, basically.

IY: Yeah, you've got one less fence to climb over.

JM: That's dead right.
