I was in Washington DC last week to meet with several cloud companies, walk one of them through my Silver Spring, Maryland data center, and network at the GovCloud event being held in DC. If you have read my blog at all over the past several years, you will quickly figure out that I was not a fan of cloud. I didn't get it, because it wasn't mature enough to offer anything better than what I could get from a managed services provider like Savvis/CenturyLink or Latisys (as examples).
What I saw last week has turned me on to what cloud delivers. Cloud computing, and the companies that embrace it, have gone from promising the world and delivering little to actually looking at the architecture, working through 'what ifs' based on actual IT requirements, applying financial filters to the noise, and coming up with a consumable offering.
One of the companies I met with - Piston Cloud - had a solid offering built on OpenStack. I knew their CEO, Josh, when he was at NASA and I was at CoreSite, and we started a big ole sandbox for cloud companies (Eucalyptus, RightScale, etc.) to tie into what NASA was doing and to get their software optimized on hardware platforms. HP was the dominant hardware vendor at the time. Fast forward: I left CoreSite to found a data center company, and Josh left NASA to mature what he had built there into a commercially viable cloud OS and start Piston Cloud.
Our meeting was the first time we had connected face to face in a couple of years, and we picked up where we left off. There was some reminiscing and laughing at the mistakes we had made along the way in getting to where we were, and there was something else - an electricity that was palpable when we shifted the discussion to actually using cloud to deliver a real solution, not just 'we do cloud'.
I live in the pipes and boxes/buildings that cloud resources use to provide the elasticity and scalability native to a cloud environment. One of my pet peeves was always the lack of discussion around security, having played in the IDM/access control space with some large companies a few years ago. The data center I bought bucked the trend in a number of ways, and I always believed the cloud vendors would mature, come back to earth, and look not only at their public/private/hybrid offerings but at where they put their environments in the first place. So the data center I bought was NOT in Northern Virginia with 50 other providers, but in Maryland - the other state with hardened bunkers for Government and Military personnel in the event there is another major 'Oh Shit'. The facility also had a global bank as a tenant, so I knew the security, and the validation of that security, would be embedded in the design of the facility - and I was right.
So when Josh and I sat down to talk cloud - and security - Piston Cloud was at a different layer in the stack, but also focused on a hardened solution for the cloud - in their case a hardened OS. Long story short, our core beliefs were embedded in what we were doing - delivering the possibility and the option of a secure Cloud OS from a secure facility with the audit trails to prove it.
There is much more to be discussed, but it was great to see another company founder develop a solution centered on their core beliefs - security in the cloud is a problem, so let's fix the problem and go to market. I will blog more as time and NDAs allow, but for organizations enamored with the cloud - welcome. And for those organizations really looking for a secure option, I think we may have something worth talking (and blogging) about.
Wednesday, June 15, 2011
ByteGrid Launched
I thought it was high time I got around to announcing the launch of my company - ByteGrid. As the name implies, it is the fusion of data (Byte) and electricity & telecom infrastructure (Grid), and since we are a data center company, it makes for a fitting, short & sweet description.
A few friends have encouraged me to blog about the whole experience, and now that we're a real company I find I have a lot less time to tell the whole story, but a few things are important for others to know and were extremely important for me personally to learn:
1. Building a company is difficult.
2. Starting a company is more difficult.
3. Standing on the other side of the Starting line, with the Finish line out ahead of you, is incredibly rewarding and puts the difficulty in perspective.
4. Many friends and family will be supportive the first 30 days after the decision to go out and do something on your own, and downright mean and skeptical from then on. They are more scared than you are.
5. Do not EVER let someone talk you out of what you know and believe to be the right way to do something, especially when you hear 'If you changed _________ you would get funded faster...' or anything else that dilutes your vision. The vision is yours, not theirs, and only you know the right way to execute it. A lot of people have money; few have a vision, and even fewer have the intestinal fortitude to stay true to it.
6. Once you take on other people's money, they have a big say in how things get done. And they should.
7. If you are not used to constant change and competing demands, and you like things nice and orderly, you won't like what you're doing. The best laid battle plans change the instant the first shot is fired.
8. Be intimidated by no one. You did something that few people ever do, and even fewer stick with for any length of time. If they don't understand that, they never will - keep moving.
9. You won't do it alone. After I brought on partners, it was amazing how quickly things came together and how the right complement of skill sets balanced one another. In our case my partners added deep financial expertise, deeper operational expertise, and legal expertise to my sales talents. We are a well-oiled machine and delegate better than any team I have worked with or for, because we know who is best equipped to handle a situation, in spite of our egos.
10. If you start a company for the sole reason of making a shit ton of money, then almost every decision you make will be short-sighted to that end. If you go into business for yourself because you like it, because you will do something better than anything you know of, and because it is a natural extension of who you are, then the shit ton of money will follow, your decisions will be sound and thought out, and you decide the right price for your efforts, not a spreadsheet.
Maybe this helps some of you get off the dime to do something, or keeps others from doing something they are not prepared to do. Either way, it's my experience, my opinions, and the next chapter has yet to be written.
Check out the ByteGrid website, and take a look at the first data center we acquired. It's a Tier IV gem, and I will blog next about why we bought it and why it is a fantastic facility. I will of course be biased; however, I backed up how solid it is with a lot of money - so I put my money where my mouth is too.
If you want a copy of my data center site selection guide - it's still available. mmacauley at bytegrid. com
Monday, December 20, 2010
Site Selection - A Case Study?
I received a call from a friend of mine earlier today asking if I had seen the data center requirement posted for the CIA. I had heard about it and got to thinking about what they would do to kick off the site selection process. Then I realized that if the requirements were posted, they must have done a lot of homework already. You would think so, anyway. So I thought I would brainstorm here on my blog, take you through my thought process, and then see how much of what I think about shows up in the posted requirements.
I will qualify this blog post with a disclaimer - the only specs I know are that they want a 200,000 square foot facility built out in 40,000 square foot chunks/phases. I have not read any document or article related to it.
So when I look at the requirement as I understand it - my high level criteria would be:
1. Available inexpensive power, ideally with a green power source that is off grid
2. Available network connections to Government TIC (Trusted Internet Connection) sites
3. Proximity to US military bases, to ensure that staff can get to the facility if needed
4. Risk profile: natural disasters, man-made disasters (civil unrest, planes into buildings, etc.), the financial condition of the host state, geologic topography, and political risk.
So for #1, availability of cheap power, ideally from a green power source that is off grid, is in the top slot for a reason. A data center's number one expense is power, and data centers are typically operated for 15+ years. Virtualization, while reducing floor space, actually increases density and power draw as more powerful servers are packed in, and the power needs to be 'green' per the mandate from Vivek Kundra, the CIO for the United States.
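To put rough numbers on that, here is a back-of-envelope calculation (in Python). The IT load, PUE, and utility rate are my own assumptions for illustration - nothing below comes from the actual requirement.

# Back-of-envelope 15-year power spend. All inputs are assumed, not from any spec.
it_load_mw = 10.0        # assumed critical IT load for a facility of this scale
pue = 1.5                # assumed power usage effectiveness
rate_per_kwh = 0.07      # assumed blended utility rate in $/kWh
hours_per_year = 24 * 365

annual_kwh = it_load_mw * 1000 * pue * hours_per_year
annual_cost = annual_kwh * rate_per_kwh
print(f"Annual power cost:  ${annual_cost:,.0f}")       # roughly $9.2M per year
print(f"15-year power cost: ${annual_cost * 15:,.0f}")  # roughly $138M

Even a one-cent swing in the assumed utility rate moves that 15-year number by roughly $20M, which is exactly why power sits in the top slot.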
To my knowledge there are two sites that COULD satisfy this requirement today - but it would take a signed contract to mobilize the funds and people to construct the power systems, and one would get knocked out of the running because of its proximity to DC proper. A box of anthrax, or a suitcase dirty bomb with nuclear waste, within 40 miles would make it 'inoperable', at least on the surface. So this isn't a data center requirement, it's a power plant with one customer - a data center.
On to #2, which deals with network connectivity - not just in the general sense, but specific to a TIC site. There are 100 of them in the US, so that limits things too if it is a dealbreaker - and it should be. Data needs to flow into and out of the facility to provide credible intelligence to our Government and to other Governments friendly with the United States. Since we aren't talking DSL pipes, these need to be 100 Gbps pipes or better. Redundant too. This will be expensive, since there is not a lot of fiber in the boonies - I know, I live in 'the boonies' (kind of).
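For a rough sense of scale on those pipes - the figures below are my own illustrative numbers, not anything from the requirement:

# How much data a single circuit moves per day at full utilization (assumed link sizes).
link_gbps = 100
pb_per_day = link_gbps / 8 * 1e9 * 86400 / 1e15
print(f"100 Gbps link: ~{pb_per_day:.1f} PB/day")    # about 1.1 PB per day

dsl_mbps = 10
gb_per_day = dsl_mbps / 8 * 1e6 * 86400 / 1e9
print(f"10 Mbps DSL:   ~{gb_per_day:.0f} GB/day")    # about 108 GB per day

That is roughly four orders of magnitude of difference, and it is before you add a redundant second path.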
Number 3 is important because in the event of some really bad shit going down on a major scale, people need to get in and out of the facility no matter what. The ability to use runways and other infrastructure specific to logistics is crucial. People can fly to a base and get choppered in, Humvee'd in, or some combination of planes and automobiles. Sorry, trains. There is also the 'able to sleep at night' piece of having jets and Blackhawks able to scramble and be airborne in seconds to sanitize any threat if needed.
Number 4 should be a given, and arguably #1. When I think about Ashburn, VA and the amount of data that is captured, processed, and stored at the end of a runway, it is a breathtaking oversight in my opinion. Knowing I can get mobile network reception on the approach to Dulles means that people bent on harming the United States and its citizens can do major harm sitting in Verizon's parking lot and pressing send. Katrina got everyone's attention with natural disasters on a major scale, but what about wildfires that close roads, burn telephone poles, and melt the insulation around copper lines? Or ice storms that make roads impassable and cause tree branches to cut power and telecommunication lines? Or the earthquake that hits, and while the seismically engineered building hardly feels anything, the 60 miles of conduit housing telecom fiber gets severed by a bridge collapse or by the ground separating around the conduit itself? Topography needs to be factored in as well, for redundant microwave links and for the sensors producing all sorts of data that needs to be captured, analyzed, and used in making educated decisions.
I added a vector that has not been much of an issue to date but one I think about - the financial condition of a state. I will use California as the example: the state is teetering on bankruptcy if you believe the mainstream media outlets. The issue won't be whether or not the state can afford to keep the power plants operating, but the civil unrest that occurs when people get incredibly pissed off. Mobs like to burn things, flip over cars, and do other things that make no sense to me. Looting happens. If there is no water or electricity, all kinds of crazy things can happen. Guess what? Data centers plan to have water and electricity no matter what, making them a target.
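Pulling the four criteria together, one way to force the comparison is a simple weighted scorecard. The weights and the two candidate sites below are made up purely to show the mechanics - a real evaluation would use its own numbers.

# Hypothetical weighted scorecard for comparing candidate sites (all values illustrative).
criteria_weights = {
    "power_cost_and_green_source": 0.30,
    "tic_network_connectivity":    0.25,
    "military_base_proximity":     0.20,
    "risk_profile":                0.25,
}

# Made-up 1-10 scores for two fictional candidate sites.
sites = {
    "Site A (rural, near base)":    {"power_cost_and_green_source": 8, "tic_network_connectivity": 4,
                                     "military_base_proximity": 9, "risk_profile": 8},
    "Site B (exurban, near fiber)": {"power_cost_and_green_source": 6, "tic_network_connectivity": 9,
                                     "military_base_proximity": 5, "risk_profile": 6},
}

for name, scores in sites.items():
    total = sum(criteria_weights[c] * scores[c] for c in criteria_weights)
    print(f"{name}: {total:.2f}")   # Site A ~7.20, Site B ~6.55

The value of the exercise is less the final number than forcing every candidate site to be scored against the same criteria before anyone falls in love with a building.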
The point in all of this is that before you even start touring facilities, virtualizing, seeing who is out there, and putting together requirements based on square feet and phases, you had better have done your homework, or you - CIA data center - will be the next disaster to recover from.