QLogic 10000 Series Adapters

April 15, 2013 | By Eric Shanks

QLogic has introduced a new product that combines its proven Fibre Channel host bus adapters with onboard solid state storage for caching. Think of it as a Fusion-IO card with a built-in Fibre Channel HBA. (Yes, I know that's an oversimplification.)

The new QLogic cards come in two flavors: a 200GB SSD option and a 400GB SSD option, both of which are 8Gb Fibre Channel. I've been told that 8Gb was chosen to start with because it's already proven and solid, whereas 16Gb Fibre Channel is much newer. I'm sure these cards will be a hit, and 16Gb versions with even larger capacities are in the works.

The adapters work much as you would expect: each write is split, with one copy going to the storage device and one to the SSD housed on the daughter card. Read operations that are still in the SSD cache then don't have to be sent to the SAN. I've been told these adapters can also be set up as targets so they can share their cache between adapters, and in the future they may be able to mirror their cache, which would be a real benefit for virtualized environments.
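To make that data path concrete, here is a minimal Python sketch of the write-through behavior described above. Everything in it is hypothetical (the class name, the dict standing in for the SAN); the real adapter does this in firmware, not in host software.

```python
class CachingHBA:
    """Toy model of an HBA with an onboard SSD read cache (write-through)."""

    def __init__(self, san):
        self.san = san          # backing SAN storage: block number -> data
        self.ssd_cache = {}     # onboard SSD cache: block number -> data

    def write(self, block, data):
        # Split write: one copy to the SAN, one to the SSD cache,
        # so the SAN always holds the authoritative copy.
        self.san[block] = data
        self.ssd_cache[block] = data

    def read(self, block):
        # Reads that hit the SSD cache never touch the SAN.
        if block in self.ssd_cache:
            return self.ssd_cache[block]
        data = self.san[block]          # cache miss: fetch from the SAN
        self.ssd_cache[block] = data    # populate the cache for next time
        return data


san = {}                                # stand-in for the backing array
hba = CachingHBA(san)
hba.write(0, b"block zero")
assert hba.read(0) == b"block zero"     # served from the SSD cache
```

Because writes go to both places, a lost or failed card costs you nothing but cache; the SAN copy is always current.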

Traditionally, caching has been done on the SAN, which still requires the controllers to do some work to fetch the data. By caching on the HBA, you remove the controller from that read path, freeing the SAN to do other work. Also, the cache on a SAN might be shared by 100 servers, and the data held in it may not be useful to some of the servers in your environment, so they get no performance benefit from it. Caching on the HBA guarantees that a specific server gets the benefit of its own cache.

If caching isn't done on the SAN and you need extremely high IOPS, Fusion-IO boards have been used in the past to great success. Unfortunately, that's effectively adding direct attached storage, so it can only benefit one machine at a time unless you're using additional replication software.

These QLogic cards give you the benefits of both direct attached storage and traditional SAN caching at the same time, which is clearly a nice combination.

Cons

Even though QLogic is advertising over 300,000 IOPS with these cards, they have a few downsides right now, probably because they are so new.

They require a rack-mount server; if you have blades, these cards won't work for you. In addition, the cards need two adjacent PCI Express slots, one for the HBA and one for the daughter card, which is attached by ribbon cable. This limits the number of systems that are good candidates to try these out. Lastly, even though these cards are dual port, they are a single point of failure because both ports sit on a single card. You could install two of them, but then you're looking at four PCI Express slots being available, which I'm guessing will be pretty unlikely for most people. The good news is that since these cards are only used for caching, the data is still available on the SAN, so a card failure doesn't mean data loss.

Thoughts

I think this is a great idea for the future, but it probably won't go mainstream until some of these limitations are overcome. Look for Fusion-IO to do something similar by adding Fibre Channel functionality to their cards as well.

If this sounds interesting to you, check out QLogic's site and request a demo: www.qlogic.com