Technology Planning via Life Cycle Management

In a previous post, I introduced the idea that technology planning and budgeting can be a paradox.  The opposing forces of the business's need to budget and technology's rate of change can leave technologists without a coherent trajectory.  When I was first tasked with managing a technology group, the primary means of developing a multi-year technology budget was a life cycle management approach.  Essentially, we assessed the amount of money spent on equipment and determined the number of years that equipment should provide usable service.  With these two values, we could then fill out a multi-year replacement budget plan.  For instance, if we spent $40,000 on a firewall and it should provide four years of useful service, then every four years we'd need to spend $40,000 to replace it.
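The mechanics of this approach are simple enough to sketch in a few lines of code. This is a hypothetical illustration, not anything we actually ran; the asset list and the `replacement_plan` function are my own invented names, and the only inputs are the two values described above: purchase cost and expected service life.

```python
def replacement_plan(assets, years):
    """Return a year-by-year replacement budget.

    assets: list of (name, cost, service_life_years) tuples.
    years: iterable of plan years, e.g. range(1, 11) for a ten-year plan.
    """
    plan = {}
    for year in years:
        total = 0
        for name, cost, life in assets:
            # Re-spend the original cost in every year that is a
            # multiple of the asset's expected service life.
            if year % life == 0:
                total += cost
        plan[year] = total
    return plan

# The firewall example from the text: $40,000 every four years.
assets = [("firewall", 40_000, 4)]
print(replacement_plan(assets, range(1, 9)))
```

Years 4 and 8 each carry the $40,000 replacement cost; every other year is zero, which is exactly the predictable funding schedule described below.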

The primary benefit of this approach to budgeting, forecasting, or planning is that it establishes a predictable funding schedule.  Of course, this schedule rests on the assumption that what we spent previously on the technology is what it will cost to replace it.  Moore's Law is the chief contributor to this assumption, and for many technologies it holds.

The difficulty with such an approach is that it doesn't address incremental upgrades or enhancements, which fall outside the life cycle window.  Another difficulty is that it fails to encapsulate differences or opportunities in features or service.  For example, when I first started purchasing wireless equipment, the scope of this function was narrowly defined as an overlay or convenience.  Today, wireless networks are the primary networking medium students, faculty, and staff use for internet connectivity.  Had we relied solely on the life cycle management approach to budgeting, we could never have appropriately funded the equipment to meet the increased demand and additional features.  A third difficulty is that the default motivation to replace equipment becomes simply the fact that it's funded.

While not perfect, life cycle management can be useful in developing technology spend forecasts.  It's relatively easy to construct, given the amount of industry information related to replacement cycles.  However, such an approach doesn't really provide a narrative or articulate where the technology, or an organization's use of it, is going.


Technology Budgeting Paradox

August 24, 2015

As one of the IT Directors for a university, I often find myself straddling the demands of technology and the needs of the business.  Often, this precarious balancing act reaps benefits when technology can be leveraged to address business challenges.  Of course, there are occasions when these two worlds contradict each other, and I'm strained by opposing forces.  This latter experience typically rears itself when the business is crafting a multi-year budget and asks technologists to identify spending needs for the next five or ten years.  Given this request, technology practitioners throw up their hands in disgust, proclaiming that the rate of change within technology makes it impossible to predict spending patterns.

To technologists' credit, it's unlikely we could have predicted in 2002 the impact that mobile devices would have, five years prior to the release of the Apple iPhone.  It's just as unlikely that in 2005 we could have foreseen the impact that cloud computing and storage would have on data center operations.  But the fact that we can't predict technology shifts and changes shouldn't preclude us from making educated guesses or charting a course based on the best information available.  It doesn't require a fortune teller's crystal ball to look at a trajectory and ask financial questions about where we are headed.  It also doesn't mean we can't make necessary course corrections along the way.

Changes occur rapidly.  The business needs to plan.  Execution requires planning and coordination.  So, like it or not, we should be able to pen a multi-year budget with specific dollar amounts associated with each year.  This process helps the business and gives technologists a baseline against which to assess deviations as the months and years tick off.

Battery Life

November 19, 2014

In the last couple of days, I've had the awkward experience of my iPhone lasting only about two hours on a full charge.  The degradation of the battery life wasn't problematic until I attended a seminar.  During seminars, I typically tweet, check email, and so on.  But with a smartphone whose battery life I could visibly watch trickling down, I suddenly found myself reenacting the Samsung "Wall Hugger" commercial, searching for and huddling around those precious power outlets.

I suppose what has been most surprising about this battery life issue is the realization of how often I use my smartphone, or more precisely, how frantic I feel when the device has no juice left to operate.  I now find myself ensuring I have a charger, cable, car adapter, and fully charged battery charger stowed away in my backpack.  Curious how complicated life has become, just because my smartphone battery drains quickly.  At some point, this device that has been such a useful tool suddenly became resource intensive.

The experience raises the question: what else am I doing that once fell into the "useful" bucket but has since transitioned into the "too much trouble" bucket?

University as Service Provider

October 16, 2013

According to an informational sheet, which cites data from the Department of Education, approximately two-thirds of students who attend college live "off-campus".  In 2009, the Fiscal & Economic Research Center at the University of Wisconsin-Whitewater published a report highlighting that a majority of students who live off-campus actually live within one to two blocks of the university.  For years, universities have sought ways to capture the "students living off-campus" market, whether via university-owned property (rental income), food services (meal plans), or information technology (telephone services and broadband network access).

When many universities were making a killing reselling long distance, some were able to capture this off-campus student market through a PBX feature known as Direct Inward System Access (DISA).  Other universities explored becoming a Competitive Local Exchange Carrier, either by establishing their own facilities or by leveraging the Unbundled Network Elements provision of the Telecommunications Act of 1996.

When broadband internet access began taking shape, some universities began looking at their Instructional Television Fixed Service (ITFS) channels as an option for providing wireless network connectivity to students who resided in close proximity to the university.  More recently, as the FCC mandated that television stations transition to digital service, Television White Spaces (TVWS) has developed as a license-free option for delivering wireless internet service.  In fact, there is an initiative by the founders of Gig.U to leverage TVWS to deliver what is being labeled "Super Wi-Fi" through the Advanced Internet Regions consortium (AIR.U).  One of the first AIR.U pilots is being conducted in Morgantown, West Virginia, through a partnership with West Virginia University.  This new offering may open the door for universities to offer wireless broadband internet service to students who live within five miles of the university.

Of course, there are always philosophical questions about what business the university should be in.  Is the university in the services business, or is it in the business of educating students?  The question I pose is whether these two businesses are mutually exclusive.  When it comes to students living on-campus, universities typically have no issue providing auxiliary services and framing them as "quality of life" services.  So why can't we use the same rationale and extend these services to students who don't reside within the university's acreage?


October 15, 2013

In the competitive world of higher education admissions, one of the key indicators universities research each year is student yield: a comparison of the number of students who were admitted to the school versus the number who actually enrolled.  To promote higher yields, universities often work extensively with admitted students to turn them into enrolled students.  This work includes scheduling an orientation visit, promoting opportunities for admitted students to meet one another, offering attractive financial aid packages, and making it easy for students to navigate the complexities of transitioning to college life: selecting a roommate, a residence hall, a major, courses to take, and so on.
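The yield figure described above is just a ratio of enrolled to admitted students. As a minimal, hypothetical sketch (the function name and the example counts are mine, not the university's):

```python
def student_yield(admitted: int, enrolled: int) -> float:
    """Yield = enrolled students / admitted students."""
    return enrolled / admitted

# e.g. if 500 of 2,000 admitted students enroll, yield is 25%
print(f"{student_yield(2000, 500):.0%}")  # prints "25%"
```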

This process of turning admitted students into enrolled students is also called “onboarding”.  For the university where I’m employed, one of the first “onboarding” tasks once a student is admitted is to create a university email account, which will give them access to a portal that houses many resources, forms, and information that will guide them through the “onboarding” process.

Last week, I attended a conference where a presenter from the University of Kansas spoke about the university's efforts to adopt lessons and structures from its student onboarding process and apply them to faculty and staff onboarding.  As a hiring manager, I thought this was an incredible idea.  In my experience, the first 30 to 60 days after a new hire starts are often lost to typical "becoming familiar" tasks.  In contrast, a new college student doesn't spend the first month of school becoming familiar.  Maybe the first day of classes, but by the second class meeting, a student has reading assignments and knows project and paper deadlines.  Granted, most colleges and universities have summer orientation events and a week of scheduled events prior to the first day of classes for students to become acclimated to the university.  But for new hires, the first several weeks are often filled with simply trying to collect the appropriate approvals, access, equipment, and contacts to do their jobs.

Establishing an onboarding process to streamline the time and energy needed before a new hire can begin working on the tasks and goals referenced in their job description seems like a "no brainer".  Sure, there are limitations, but many of the things provided to students during their onboarding process could be leveraged for faculty and staff.  Technology resource allocations, parking permits, benefit signups, direct deposit forms, getting business cards … all of these tasks could easily be gathered into an onboarding portal that new hires access between accepting their job and starting their first day.

For students, the onboarding process begins with being admitted.  For new hires, it would begin with the hiring manager executing the proper paperwork.  This may be the first battle in establishing an onboarding process.  Hiring managers would need to understand that before anything can happen, they must process the paperwork so the new hire can access the onboarding website.  This may need special attention and potentially a culture shift.  But if it means a new hire can be productive earlier, I suspect hiring managers would be more willing to commit to getting the process started.

Making Big Rocks out of Little Rocks

September 25, 2013

Today, I had a dialogue with a visiting consultant about some of the struggles related to the constant demands IT organizations face.  We agreed there are "big rocks", and we typically address these with significant and often appropriate measures.  However, it's the "little rocks" that are often neglected and that generate ill feelings from our customers.

Many people are familiar with Stephen Covey's big rock analogy … fill your bucket first with the big rocks, and then follow this up with the little rocks.  We shared the assessment that this is a good start, but in most of our lives not all the little rocks fit into the bucket.  Whether we are willing to admit it or not, our bucket … time, attention, capacity, etc. … doesn't exist in a vacuum.  We aren't static buckets, so even by focusing on the big rocks first and following up with the little rocks, we often can't fit in all the little rocks.  So while we have committed to the big rocks, there comes a time when we must prioritize which of the little rocks will make it into our bucket.

This is where the consultant offered a nugget that really struck a chord with me.  His advice: make big rocks out of the little rocks.  He observed that often there are requests, activities, assignments, or opportunities that share cohesion with each other.  He likened it to a mash-up, or a product offering made up of several different components.  When these cohesive little rocks are rolled up together, they take the form of a big rock and in turn get more of our attention.  The consultant cautioned that by building big rocks out of little rocks, we don't necessarily answer the question of how to avoid neglecting the little rocks.  Rather, we have addressed how we prioritize which little rocks we will address and which we have to neglect for the moment.

The other benefit of this idea is that we can communicate our framework … focusing most of our effort on the big rocks.  By telling customers and partners where our effort will reside, we essentially inform them that some rocks will be neglected.  If we also offer the insight that little rocks can be made into big rocks via cohesion, we invite our customers to assist in finding the cohesion that brings little rocks together to build big rocks.  Hopefully, the invitation for others to participate in identifying that cohesion will leave us more time to tackle the little rocks that don't share cohesion with other little rocks.

Making a Case for Face-to-Face Classroom Training

September 11, 2013

As a technologist, I believe it is vital to continue to develop my skills and competencies through formal and on-the-job experiences.  As a technology manager, I strive to cultivate an atmosphere where my team recognizes the value of developing their skills and expertise.  As a leader within a technology trade association, I’ve heard various perspectives identifying that opportunities to network with peers and acquiring professional certifications are strong motivators for association participation.

However, in recent years I've found it difficult to find appropriate classroom training programs catered to technology professionals.  In the last six months, almost half my staff have encountered a situation where an off-site technology class was canceled because not enough people signed up for the course.  More recently, I purchased a new network access control system that included a training component.  To my surprise, the training was only available online.

I may be showing my age, but I have a certain bias toward structured, off-site, classroom training.  In my opinion, these settings offer tremendous value for technologists.  I will concede only a small portion of the value emanates from the trainer leading the course.  The majority of the value comes from the other attendees participating in the course.  Listening to the stories of others, who often have different experiences with a vast array of products and technologies, makes up for any lost time or workload disruption.  In my limited experience with online training setups, the absence of this interaction with classmates is a deterrent.  Wikis, knowledge bases, Google searches, and YouTube videos have the potential to pass on technical information, but they fail to capture the perspectives and experiences of a classroom full of technologists whose experiences, observations, and opinions typically differ considerably from my own.

So where does a technologist find this melting pot of valuable knowledge in a market that is moving away from my preferred method of knowledge transfer?  Even when I can find classroom-style training options, more often than not they fail to sell enough seats to justify the expense of hosting the course.  Is this an opportunity for trade associations to revise their stance on vendor-agnostic requirements and begin pursuing the finite training allowance with manufacturer- and product-specific courses that incorporate some level of certification or professional development credit?  Or should technologists resign themselves to the realization that, with newer delivery methods, they will have to find another source for the war stories of colleagues?