Private clouds: the next generation of IT infrastructure, or boondoggle?

VMware has announced vSphere® to spur the move to private clouds: stepping from virtualized servers to fully virtualized infrastructures, where applications and users share services based on actual demand rather than preset capacity. Along with its competitors, VMware is pushing businesses to replace or upgrade to a private cloud infrastructure as the next great thing. Add to this a number of thought leaders promoting the move to private clouds, some arguing that private clouds are a way for businesses to step smoothly toward public clouds and develop confidence.

VMware will sell vSphere® to IT leaders who either need a highly secure environment and want the cost and performance advantages of a private cloud, or who have convinced themselves and their peers that a private cloud is the best answer for the business. But is it, really? Other parts of the business have used shared services with full confidence for years.

Do you have your own water supply and filtration system? Not many companies do; they trust the municipality. A company that makes baked goods or a similar product may have its own delivery fleet, but the rest of us trust FedEx to deliver products to customers. Then there is ADP to manage payroll, and Internet, voice, and cellular connectivity are all shared private services. And of course there are banks. No business keeps all its revenue in a safe of its own. Banks are one of the oldest shared private services.

My long-winded point is that businesses already have established relationships with outside services that provide business- and mission-critical shared private services. These shared private, 'cloud-like' services have earned a high level of trust and confidence. If a business loses trust in FedEx, it can switch to UPS. Not getting more bars in more places? Switch to Verizon.

Why should a business's data and applications be treated any differently from its product delivery or from where it keeps its money? I am not advocating tossing out all the data centers in the world; I have IT brethren who would be picketing outside my house. My position is that a business should consider treating and viewing data and applications the same way it treats and views its products, money, and facilities. Cloud computing is an option with pros and cons, like having your own fleet of delivery trucks versus using a logistics provider.

Why does a business need a stepping stone like private cloud computing? No technical reason that I can fathom, only cultural ones.


Project Management – Complex or Simple – how to be both…

Let's implement project management, and the crowd goes wild. What, that is not the response you have experienced at your organization? Well, why is that? Like eating more vegetables and fish and drinking more water, which have a wealth of evidence behind their health benefits, a good, standard project management process will clearly benefit an organization's top and bottom lines.

I wrote last week about aggressively killing failing projects to improve project success. That spurred a discussion with a colleague of mine about why organizations continue to resist project management. Other processes are resisted too, like all change. However, project management seems to hold a place of its own. Is it because it often comes out of the IT function? We IT folks, like engineers, like the way a project management process requires end users to clearly define their needs before starting. Thus, other functional areas see it as an IT process rather than a corporate one.

Perhaps it is because other functions do not have as many cross-functional activities (aka projects) as IT does. Yet I suspect another issue is at play here.

Go through a Project Management Institute certification training program and the processes covered are deeply complex and very, very comprehensive. Add to this project management software that is "just so intuitive," with work breakdown structures, resource-allocation sheets, and PERT and Gantt charts. From a non-IT or non-engineering perspective, this level of complexity and comprehensiveness looks redundant and protracted.

Other processes introduced in a business use the baby-step method: start simple and intuitive, then, as staff develop expertise, add complexity and comprehensiveness to expand the effectiveness of the process. A purchasing process starts with a simple request form, then adds multiple quotes, perhaps a multi-vendor electronic catalog, tracking to budget lines, and in the end could be like Amazon, recommending what others have purchased when they requested similar items.

However, a project process often starts off highly complex and comprehensive. An immediate reaction would be: whoa, if it is this complex now, what is the process going to be like when we get good at it? Resistance is instinctive. Along with aggressively killing failing projects, communicate strongly that as staff and management get 'good at' project management, the process does the reverse of what most other processes do.

The process relies less on the complexity of intense detailed scoping, project planning, cost estimates, change control, and parametric analysis, along with comprehensive recording of activities at the lowest task level. It relies more on the staff's expertise and intuition.

If acquiring new businesses is a rarity, the complexity and detailed comprehensiveness of a formal project management methodology reduces the risk and improves the success of an activity staff are inexperienced in performing. For an activity done often, say a new product release, commit to the full complexity and comprehensiveness for the next 2-3 product releases once the project process is introduced. Then leverage the experience and intuition the project teams have gained from their successes and failures to reduce the risk.

In short, a project management process reduces risk. For new or rarely performed activities, stick with complex and comprehensive. The same project process can be simple and intuitive for activities an organization has a wealth of experience performing.

Combine aggressive killing of projects with an equally aggressive approach to make the process simple and intuitive for greater project success.

Increase project success by aggressively killing projects

After x number of years with formal project methodologies, project software tools, and training upon training upon training, companies continue to struggle to manage projects. Combined with this struggle is a culture with a strong aversion to failure (aka risk). I put forth that the two are deeply tied. A business will not improve its project success through a new methodology, a new tracking tool, additional training, or even by replacing the staff with new, more highly skilled people.

What will improve project success is aggressively killing projects that are not succeeding. Call them challenged, failing, behind schedule, in the yellow or in the red. If a project has missed more than one milestone, it is a candidate to be killed. These projects need to be put under a microscope and examined for the likelihood they will hit their next milestone. If the intense review finds that the project will not be able to meet the next milestone, it should be killed. But remember, killing it does not mean you skip the closure phase. Close out the project, capture all the lessons learned, and put the resources onto the projects that are succeeding to optimize their chance for success.
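The triage rule above is mechanical enough to sketch in code. The following is a minimal Python illustration of that rule, not a real tool; the project names, fields, and the one-milestone threshold are taken from the description above, while everything else is an invented placeholder.

```python
# Toy sketch of the "kill on missed milestones" triage rule.
# Fields and sample data are illustrative placeholders.

from dataclasses import dataclass


@dataclass
class Project:
    name: str
    missed_milestones: int      # milestones missed so far
    likely_to_hit_next: bool    # verdict of the under-the-microscope review


def triage(project: Project) -> str:
    """Return the recommended action for a project."""
    if project.missed_milestones <= 1:
        return "continue"
    # More than one missed milestone: candidate to be killed.
    if project.likely_to_hit_next:
        return "continue under review"
    # Killing still means completing the closure phase first.
    return "kill: run closure phase, reassign resources"


for p in [
    Project("CRM rollout", missed_milestones=2, likely_to_hit_next=False),
    Project("Warehouse move", missed_milestones=2, likely_to_hit_next=True),
    Project("Payroll upgrade", missed_milestones=0, likely_to_hit_next=True),
]:
    print(p.name, "->", triage(p))
```

The point of the sketch is that the rule has exactly two inputs, missed milestones and the review verdict, which is what makes it hard to argue with once it is written down.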

Making people work on failing projects increases the pressure to succeed but does not provide the environment for success to occur. Fail to kill projects and you are begging to be lied to by your project teams. Aggressively killing failed projects, completing the closure phase, and placing people on projects that are succeeding demonstrates that you walk the talk of your project methodology and that mistakes are acceptable.

Aggressively killing projects has a positive impact on the bottom line through the increased success of other projects and the cost avoided by not continuing projects that will fail. Add to this the cultural impact. A fear of failure not only adds stress to a project that is already struggling but also creates an overall environment that stifles innovation. People will resist proposing new ideas unless success is guaranteed. How many projects are guaranteed to be successful?

Therefore, aggressively killing projects improves the bottom line through cost savings and the top line by supporting an environment of innovation and removing the fear of failure. If you find these thoughts on project management helpful and would like to learn solid project management without reading a boring project management book, read The Deadline by Tom DeMarco. It gives you an understanding of good project management in a fun-to-read novel format.

Why the cloud is not for everyone.

This is the year of the cloud, right? That is what 2009 has been hyped up to be. Forget about data centers and those virtualization projects. Move everything, applications, data, and security, all of it, to the cloud. Data centers are a utility just like power and water. You don't produce your own power and water, do you? For a certain number of businesses this is the case. Their IT is mostly a utility, one that uses the phrase 'keeping the lights on' more often than not when describing many of its activities. If that is your IT, stop reading now and please enjoy one of my other entries.

However, for those who leverage their IT applications and infrastructure not only to drive internal performance but to deliver products and services to their customers, the cloud could be a problem. After all, part of the reason your customers purchase your products and services is their trust in your ability to deliver. That trust is built from a variety of factors, one of which is your customers' desire to verify that ability. If you are signing a contract to purchase 1,000,000 tires, you would like to see either the warehouse holding 1,000,000 tires or the plant with the equipment lined up to produce them.

Likewise, if you are pulling content or having it pushed to your business, you will want to see the stack of servers, network hardware, and system administrators that will provide that content. If your business offers a service involving highly confidential data, add physical and logical security measures to the list. After all, have you ever opened an account at a bank that did not have a vault visible? You went to the bank because you trusted the name (aka the brand). But when you walk in, you see that it is an actual building (normally made of brick, go figure), with people working there processing transactions and, yes, a vault right where you can see it, with a nice thick metal door.

Can you deliver most services utilizing a cloud model? Without a doubt, and there are businesses making a strong success of that structure. However, if the brand of your products and services relies heavily on your customers being able to trust and verify, due to the highly confidential nature of what you deliver, make sure you have all the bells and whistles that allow your customers to verify that a reliable and secure business transaction will occur.

Yet businesses whose model does not support a move to the cloud can still learn a great deal from it. Effective cloud services revolve around a pay-as-you-go model: not only adding storage, processing, and bandwidth as your business needs them, but also subtracting them when you don't. Have a sudden surge in storage needs? With the cloud you may send an email to request additional space, or just start using it, and next month's bill will be somewhat higher. Use less, and the bill is lower.
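To make the pay-as-you-go arithmetic concrete, here is a toy comparison in Python. The unit prices and monthly usage figures are invented for illustration, not taken from any vendor; the point is only that elastic billing tracks a peaky usage curve while owned capacity is paid for at peak size all year.

```python
# Toy comparison: owned fixed capacity vs. pay-as-you-go billing.
# All prices and usage figures are made-up illustrative numbers.

FIXED_CAPACITY_TB = 100          # capacity you must own to cover the peak
FIXED_COST_PER_TB_MONTH = 40.0   # amortized monthly cost of owned capacity
CLOUD_COST_PER_TB_MONTH = 55.0   # higher unit price, but billed on actual use

# Hypothetical monthly storage usage in TB, peaking mid-year.
monthly_usage_tb = [50, 50, 55, 60, 90, 100, 100, 70, 60, 55, 50, 50]

# Owned capacity is paid for at full size every month.
fixed_annual = FIXED_CAPACITY_TB * FIXED_COST_PER_TB_MONTH * 12

# Pay-as-you-go is billed only on what was actually used each month.
cloud_annual = sum(use * CLOUD_COST_PER_TB_MONTH for use in monthly_usage_tb)

print(f"fixed capacity: ${fixed_annual:,.0f}")
print(f"pay as you go:  ${cloud_annual:,.0f}")
```

With these made-up numbers the elastic model comes out cheaper despite the higher unit price, because most months sit well below the peak; flatten the usage curve and the advantage disappears, which is exactly the trade-off the next paragraph describes.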

If your business has its own data center, you can, without too much struggle, add capacity through good agreements with vendors who can quickly supply equipment. With a strong virtualization design, you can also remove that capacity when the requirement has ebbed. There would be a minimum use term (likely 30 days) for the equipment, but your annual cost would more closely match your business utilization, leading to a stronger profit level.

If that is not fast enough for your business to both add and subtract capacity, a hybrid cloud/data center model would be an effective way to balance costs while maintaining strong brand protection for your products and/or services.

Give Cisco, Check Point, Dell, HP, and/or IBM a call and ask them about making your infrastructure more cloud-like, lowering your TCO while driving higher performance and reliability. Put pressure on them to develop a strategy that lets your business utilize all of its infrastructure at peak levels and add or subtract components as business needs change, on a monthly basis.