
To France…

A few days ago, a few of “us” went to France. On a cold November morning in a brand new data center hall, we had a look at some Version 1 OCP racks, and a very nice conversation with a bunch of friendly people interested in getting the foundation going.

An OCP Version 1 rack, with three power zones. You can see the centralized power supplies at the bottom of each zone.

See Open Rack Specs and Designs, the Open Rack Standard 1.2 Spec and Facebook Open Rack V1 Specification. There is also the Facebook V1 Power Shelf Specification.

A look at the back of the rack. There are 3×3 12V bus bars supplying the power zones.
Power handling and distribution are different in the V2 racks; there are adapter sleds.

 

There are a lot of amps to push at 12 V in order to deliver the 12.5 kW per rack. So here goes the output of the power shelves.
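A quick back-of-the-envelope check of how many amps that actually is (my arithmetic from the figures above, not numbers from the rack spec):

```python
# 12.5 kW delivered over a 12 V bus means on the order of a thousand amps.
rack_power_w = 12_500   # 12.5 kW per rack, as stated above
bus_voltage_v = 12.0    # OCP V1 12V bus bars

amps_per_rack = rack_power_w / bus_voltage_v
amps_per_zone = amps_per_rack / 3  # three power zones per V1 rack

print(f"{amps_per_rack:.0f} A per rack, ~{amps_per_zone:.0f} A per power zone")
# → 1042 A per rack, ~347 A per power zone
```

Which is why the bus bars are the massive copper slabs you see in the photos.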

 

A Winterfell server (E5-26xx v2 CPUs), refurbished Facebook hardware. “Vanity free” is what this look is called, and rightly so.

 

All handling is from the front – slide the server into the sled, and the server grabs the power bar. Network, OOB and everything else is at the front, and completely tool-less.

See Open Rack Specs and Designs, “Facebook Open Rack V2 Cubby Sub-Chassis Specification” for the sled and the tray.

A Haystack storage sled, empty. Disks go into this, bare (no carriers needed).

A few slides about the Haystack (PDF).

The Haystack has a microserver for cold-storage and standalone needs. Or you can connect it with a bunch of SATA cables to servers in a slot on top of the Haystack.

 

A Xeon-D in a Yosemite Valley Microserver. This is a potential web node.

The QCT Rackgo X Yosemite Valley + the requisite Microservers. 12 Microservers in 2 OU.

Yosemite Valley Carrier.

 

Backside of the Yosemite.

 

The backside of V1 rack equipment; it connects to all three power bars.

 

The V2 rack has only the center rail. Sleds with adapters exist.

 

The OCP Version 2 rack handles things a bit differently.

 

A brief look at a QCT V2 Open Compute Rack.
Published in Data Centers

2 Comments

  1. kris

    Things I like:

    • No video card. Things that have no video card cannot switch to graphics mode during boot just to show vendor logos that nobody is looking at, ever, anyway.
    • The OOB that is there is dedicated, out of band, and dumbed down to the bare necessities, which means it can’t backdoor the machine – so I can sleep with peace of mind.
    • That haystack was lifted into the rack while we looked, and the process took all of 30 seconds and 0 tools.
    • Dat airflow. Awesome.
    • I haven’t specifically posted detail pictures of the cabling, but while it’s not pretty in the /r/cableporn sense, it’s made to measure, easily maintained even in a changing environment and fast.
  2. Ralf

    Oldfield!
