Git and Gitflow References

November 16, 2015

Getting Started with Git (Microsoft):

A Step by Step Guide to using GitFlow (endjin):

Gitflow Tutorial (Atlassian):

Git Branching Model (Vincent Driessen):

SourceTree App (Atlassian)

Engineering Effectiveness

October 8, 2015

Recently I stumbled across an awesome blog post from Peter Seibel (@peterseibel), the tech lead of Twitter’s Engineering Effectiveness group, entitled Let a 1,000 flowers bloom. Then rip 999 of them out by the roots.  It is a written version of a talk he gave at the Facebook @Scale conference.  It is a bit on the wordy side, but there are some really interesting nuggets, a bit of insight into the history of Twitter, and some very witty analogies.  Here are a few of the highlights.

  • We know how to build abstractions and modularize our code so that we can manage large code bases and how to deploy our software so it can handle the demands of millions or even billions of users. On the other hand, I’d argue that we don’t really yet have a good handle on how to scale that area that exists at the intersection of engineering and human organization—the place where groups like Engineering Effectiveness work.
  • I think a big part of the problem is that we—as an industry—are not very good about thinking about how to make engineers effective.
  • The Twitter EE motto is: “Quality, Speed, Joy”. Those are the three things we are trying to affect across all of Twitter engineering. Unlike that other famous triple, Fast, Cheap, Good, we believe you don’t have to pick just two.
  • We know from Dune that fear is the mind killer. So how does fear manifest in the context of software development? I would say tech debt. Tech debt is the mind killer. Tech debt is the lack of quality. It slows us down. It makes us miserable.
  • In order for engineering effectiveness engineers to be able to boost effectiveness across all of engineering, things need to be standardized.
  • Your goal should be to pick the set of tools and processes you will support and support the heck out of them. Invest more than you probably think you need to and focus relentlessly on making the tools and processes you do support awesome.
  • Finally there’s a psychological aspect to providing good tools to engineers that I have to believe has a real impact on people’s overall effectiveness. On one hand, good tools are just a pleasure to work with. On that basis alone, we should provide good tools for the same reason so many companies provide awesome food to their employees: it just makes coming to work every day that much more of a pleasure. But good tools play another important role: because the tools we use are themselves software, and we all spend all day writing software, having to do so with bad tools has this corrosive psychological effect of suggesting that maybe we don’t actually know how to write good software.
  • We don’t even really know what makes people productive; thus we talk about 10x engineers as though that’s a thing when even the studies that lead to the notion of a 10x engineer pointed more strongly to the notion of a 10x office. But we’d all agree, I think, that it is possible to affect engineers’ productivity. At the very least it is possible to harm it.

All of this makes a ton of sense and is very complementary to two intersecting industry trends – DevOps and Dev in Test.  If you agree that agile is at the heart of DevOps – development and operations – then engineering effectiveness is an enabler.  A fundamental premise of DevOps is to minimize work in progress.  Let’s extend that model to tech debt – minimize technical and mental baggage.

Similarly, Dev in Test refers to test engineers who are part of the development team.  Again the idea is to allow the organization to deliver value to customers faster.  An engineering effectiveness group, or even a single engineer, is another set of hands to streamline the efforts of the main line development team.

My one quibble with Seibel’s assertions is his apparent questioning of the existence of the 10X engineer, as if they were like the Loch Ness Monster.  On the contrary, 10X engineers are as real as Murphy’s Law.  Managers are well served optimizing their contributions any way they can: providing the best available tooling, minimizing unnecessary activity (e.g., meetings), and eliminating anything else that takes them away from the code.

Revisiting Augustine’s Laws

August 19, 2015

Augustine’s Laws is a collection of management insights first published in the mid-1980s by Norman Augustine, former undersecretary of the Army and CEO of defense contractor Martin Marietta. It contains 52 (one per week) “laws” of management that Mr. Augustine picked up in his many years working in government and the defense industry.  Each law is written in the form of a humorous vignette that is meant to stand on its own.  The book is still available via Amazon (though at a premium) and, given its substantial enduring wisdom, is surprisingly hard to find through the library system.

Most of the book is specific to government contracting circa the late 20th century, but some of the insights are just as applicable today as the day they were first written.  The canonical list of the laws is available at Wikipedia.  Here are some of the more interesting ones:

  • Law XV (aka Law of Insatiable Appetites) – The last 10% of performance generates one third of the cost and two thirds of the problems.
    • Corollary 1: The price of the ultimate is very high indeed. Sometimes it would seem that one might be better served by having more of a little less.
    • This is very similar to George Patton’s statement – “A good plan, violently executed now, is better than a perfect plan next week.”
  • Law XXIII (aka Law of Unmitigated Optimism) – Any task can be completed in only 1/3rd more time than is currently estimated.
    • Corollary 1: If a schedule is three quarters complete, only one third of the time remains.
    • Corollary 2: When it comes to schedule adherence everything is relative.
    • Corollary 3: The sooner you start to fall behind the more time you will have to catch up.
  • Law XXIV (aka Law of Economic Unipolarity) – The only thing more costly than stretching the schedule of an established project is accelerating it, which is itself the most costly action known to man.
  • Law XXXV (aka Law of Definitive Imprecision) – The weaker the data available upon which to base one’s conclusion, the greater the precision which should be quoted in order to give the data authenticity.
  • Law XXXVII (aka Law of Apocalyptic Costing) – Ninety percent of the time things will turn out worse than you expect. The other 10% of the time you had no right to expect so much.
  • Law XLVIII (aka Law of Oratorical Engineering) – The more time you spend talking about what you have been doing, the less time you have to do what you have been talking about. Eventually, you spend more and more time talking about less and less until finally you spend all of your time talking about nothing.
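Law XXIII is also fun to put to numbers: if every re-plan is itself one third too optimistic, the overruns compound.  A quick sketch (the figures are purely illustrative):

```python
def apply_law_xxiii(estimate, re_estimates=3):
    """Augustine's Law XXIII, compounded: every estimate turns out to
    be one third too low, so actual = estimate * 4/3 at each re-plan."""
    actual = estimate
    for _ in range(re_estimates):
        actual *= 4 / 3
    return actual

# A 9-month project, re-estimated three times along the way:
print(round(apply_law_xxiii(9), 1))  # 21.3 months
```

Three optimistic re-plans more than double the original schedule – Corollary 3 in action.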

The perspective of the book is that of a senior manager working on large defense programs in the late 1970s and early 1980s.  While there are certainly universal truths, much has changed in the intervening thirty years – particularly in the field of software development.  Today software is generally built incrementally by self-directed teams using a flavor of agile.  Most agile teams live by the credo that the best way to eat an elephant is one bite at a time.  Agile is popular not because the problems are any less challenging – indeed application complexity is increasing, not decreasing – but because it provides a predictability that simply is not possible with massive projects.

As interesting as the laws are, the management observations in the last chapter are as relevant today as the day they were written – if not as pithy.

  • People are the key to the success in most any undertaking, including business.
  • Teamwork is the fabric of effective business organizations.
  • Self-image is as important in business as in sports. A corporate team must think of itself as a winner.
  • Motivation makes the difference.
  • Recognition of accomplishment (and the lack thereof) is an essential form of feedback.
  • Listening to employees and customers pays dividends – they know their jobs and needs better than anyone else.
  • Delegation, wherever practicable, is the best course.
  • Openness with employees and customers alike is essential to building trust.
  • Customers deserve the very best.
  • Quality is the key to customer satisfaction.
  • Stability of funding, schedules, goals and people is critical to any smooth business operation.
  • Demanding the last little bit of effort from oneself is essential – it can make the difference against competitors who don’t have the will to put out the extra effort.
  • Provision for the unexpected is a businessperson’s best insurance policy.
  • “Touch-Labor” – people who actually come into contact with the product – are the only certain contributors in any organization.
  • Rules, regulations, policies, reports, and organization charts are not a substitute for sound management judgement.
  • Logic in presenting decision options, consequences, benefits, and risks is imperative.
  • Conservatism, prudent conservatism, is generally the best course in financial matters.
  • Integrity is the sine qua non (indispensable and essential action, condition, or ingredient) of all human endeavors including business.


Thoughts on RightScale’s Annual State of the Cloud Report

May 2, 2015

In January of 2015 cloud portfolio management company RightScale Inc. surveyed 930 users of cloud services for their Annual State of the Cloud report.  The findings are both interesting and insightful.  Several key findings are highlighted here.

  1. Cloud is a given and hybrid cloud is the preferred strategy. According to the survey, 93% of respondents are using the cloud in one way or another.  Further, more than half (55%) of enterprises are using hybrid clouds – either private clouds or an integration with on-premise solutions.
  • Savvy start-ups realize that public clouds can be expensive relative to self-hosting in an economy co-lo facility. Until traffic ramps to the point where the ability to immediately scale justifies it there is no urgency to host in AWS or Azure.
  • Public clouds are ideal for a variety of scenarios – unknown, unpredictable, or spiking traffic; the need to host in a remote geography; or where an organization has priorities other than hosting. Conversely, self-hosting can be more economical.  For example, an Amazon c3.2xlarge – 8 vCPU and 16 GB RAM (as of May 2015) – runs about $213 / month, or approximately $2,500 / year per server.  Organizations that already have an investment in a data center or have on-premise capacity often find it cost-effective to self-host internal applications.
  • Many enterprises are not surprisingly reluctant to walk away from significant capital investments in their own equipment. Hybrid clouds allow organizations to continue to extract value from these investments for tasks that may be difficult or costly to implement in a public cloud.  For example, high security applications, solutions which must interact with behind the firewall systems, or processing / resource intensive programs.
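The cost comparison above is easy to turn into a break-even calculation.  A rough sketch (the hourly rate and co-lo figure are illustrative assumptions, not quotes):

```python
HOURS_PER_MONTH = 730  # average hours in a month

def monthly_cloud_cost(hourly_rate, utilization=1.0):
    """On-demand cost of one instance at a given utilization (0..1)."""
    return hourly_rate * HOURS_PER_MONTH * utilization

def break_even_utilization(hourly_rate, colo_monthly):
    """Utilization below which on-demand beats a fixed-cost co-lo server."""
    return colo_monthly / (hourly_rate * HOURS_PER_MONTH)

# Hypothetical $0.42/hr instance vs. a $150/month co-lo server:
print(round(monthly_cloud_cost(0.42)))              # 307 - always-on cloud cost
print(round(break_even_utilization(0.42, 150), 2))  # 0.49 - cloud wins below ~49% utilization
```

Which is the report’s point: for spiky or partial utilization the cloud wins; for an always-on workload, self-hosting can be cheaper.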

93% of Respondents Are Using the Cloud

  2. DevOps rises; Docker soars. DevOps is the new agile.  It is the hip buzzword floating around every organization.  According to Gene Kim, author of The Phoenix Project, DevOps is the fast flow of code from idea to customer hands.  The manifestation of DevOps is the ability to release code as frequently as several times a day.  To achieve this level of flexibility organizations need to eliminate bottlenecks and achieve what Kim calls flow.  Tools like Puppet, Chef, and Docker are enablers for DevOps.  In forthcoming surveys it can be expected that Microsoft’s InRelease (part of Visual Studio Online) and Hyper-V Containers will have prominent roles in organizations that use the Microsoft stack.

DevOps Adoption Up in 2015

  3. Amazon Web Services (AWS) continues to dominate in public cloud, but Azure makes inroads among enterprises. AWS adoption is 57 percent, while Azure IaaS is second at 12 percent.  (Among enterprise respondents, Azure IaaS narrows the gap with 19 percent adoption as compared to AWS with 50 percent.)  This is consistent with other market surveys – see the Synergy Research Group study from October 2014.
  • At this point the market has effectively narrowed to only two major cloud IaaS providers: Amazon and Azure. While there are other offerings from Rackspace, IBM, HP, and other non-traditional sources (e.g., Verizon), these seem to be solutions for organizations that already have a relationship with that vendor or have a specific reason for going away from the market leaders.
  • There are certainly many other PaaS solutions, including Google and Heroku (owned by SFDC). Similarly, there are many SaaS solutions, including Google Apps, NetSuite, Taleo, and many other vertical-specific solutions.
  • This respondent base is heavily represented by small business – 624 SMB vs. 306 Enterprise. Although Microsoft is working hard to attract start-ups, the reality is that today most entrepreneurs choose open source technologies over Windows.  Conversely, Microsoft technologies are disproportionately represented in larger enterprises.  While today AWS is the undisputed market leader, Azure is growing quickly and can be expected to close the gap.  Microsoft is investing heavily in its technology, is actively reaching out to the open source community, and is making it apparent that it is not satisfied with being an also-ran.
  4. Public cloud leads in breadth of enterprise adoption, while private cloud leads in workloads.
  5. Private cloud stalls in 2015 with only small changes in adoption.
  • Private clouds are being used for functional and load testing as well as for hosting internal applications (e.g., intranets) where the costs and risks associated with a public footprint do not exist. Where in the past organizations would have had “farms” of low-end desktop PCs and blade servers in server closets, it makes sense that these types of applications have been moved to private clouds hosted on virtualized servers that can be centrally managed, monitored, and delivered to users more cost-effectively.
  • It is interesting that the data suggests the market for virtualization infrastructure has matured and is not growing. The market leader in this space continues to be VMware, with Microsoft gaining traction in enterprises.
  6. Significant headroom for more enterprise workloads to move to the cloud. An interesting data point – 68% of enterprise respondents say that less than 20% of their applications are currently running in the cloud.
  • It will be interesting to see how this number changes over time. Reversing the statistic – 80% of enterprise applications are still run on premise.  This could be due to IT organizations’ heavy investment in capitalized equipment and data centers.  It could be that the economics of a public cloud are still too expensive to justify the move.  There could be technical limitations, such as security, holding back cloud adoption.  Finally, there could be organizational prejudices against taking what is perceived as a risk in embracing the public cloud.  Very likely it is all of the above.
  • The role of a visionary CTO is to move their organization forward to embrace new technologies, break down prejudices, and find new and better ways to serve customers. Cloud vendors are working to make it easier for organizations of all sizes to adopt the cloud by lowering cost, increasing security, and providing new features which make management more seamless.
  • While this study does not provide any data on the breakdown of PaaS vs. IaaS, it is a reasonable assumption that most enterprise adoption of the cloud is IaaS, as this is by and large simply re-hosting an application as-is. PaaS applications on the other hand typically need more integration, which in many cases involves software development.  Once done, however, PaaS applications are often more secure, scalable, and extensible as they take advantage of the hosting platform infrastructure.

Cloud Challenges 2015 vs. 2014

Finally, RightScale has a proprietary maturity model which ranks organizations’ comfort level with cloud-related technologies.  Interestingly, the data suggests that nearly 50% of organizations have yet to do any significant work with the cloud.  This can certainly be expected to change over the next 2-3 years.

Cloud Maturity of Respondents

Predictions 2015: CIOs Accelerate The Business Technology Agenda

February 21, 2015

At the end of 2014 Forrester published a short research note entitled “Predictions 2015: CIOs Accelerate The Business Technology Agenda.”  (The report is $499 if you get it from Forrester, but Microsoft has apparently licensed the content to make it free.)  Some of the provocative ideas in the note include:

Many CIOs have the technical expertise and cross-functional business purview to help drive digital innovation, but they are too often still seen as the leader of a cost center.

This is the truth.  Budget season is not much fun for anyone, but for those of us in IT it is a constant battle for funding.  Many times the best outcome is level funding.  Generating revenue on our own would change the game.

Both consumer and business customers increasingly expect real-time access to connected product and service information. These expectations not only define customer engagements but also ripple throughout the supply chain — shortening product cycles, requiring more agile operational capabilities, and creating opportunities for new, disruptive digital services.

Even some of the smallest businesses – appliance dealers, car services, and even dentists – offer on-line access to inventory, scheduling, and account information.  Larger companies that offer primitive access to this type of information – or worse, none at all – absolutely stand out as laggards.  Increasingly, the expectation is not only that information will be on-line and accessible; organizations without a credible mobile strategy are at a competitive disadvantage.

  • Prediction No. 1: CIOs Accelerate The Business Technology Agenda

There are a ton of buzzwords buried in this prediction, including agile development, connected products, mobile services, and customer-focused governance models.  The reality that many of us live with every day is doing more with less.  Technology is the lever for these new challenges.  For example, automation of tasks previously only done by humans can lead to dramatic cost and time savings.  Better, cleaner execution using new development techniques (e.g., DevOps, automated testing) makes it faster and more efficient for development teams to get code into the hands of customers.  Cross-platform development is now realistic using tools like Xamarin.  Finally, highly reliable consumption-based cloud delivery platforms are available from multiple providers.  While the challenges have never been more formidable, the tools at our disposal have never been more powerful.

  • Prediction No. 2: CIOs Unlock Data-Driven Business Opportunities

According to CSC there will be more than 35 ZB of data generated in the year 2020 – a 4,300% increase from 2009.  It’s hard to visualize that much data.  Much of that information will come from the emergence of sensor data (think connected refrigerator) that has previously never been on-line or accessible.  While there will no doubt be new unstructured data such as this, traditional web and mobile applications (Facebook, Instagram, and Twitter) will continue to evolve and grow.  The dramatic growth of information calls for new solutions to make sense of it all.  Again there is cause for real optimism that technology is keeping pace: there are powerful NoSQL / NewSQL databases such as Cassandra, analysis engines such as Hadoop, and visualization platforms such as Tableau.  Many of these technologies are open source and most of the commercial solutions are reasonably priced – often tied to consumption.  One of the primary challenges is the imperative to get technology organizations thinking differently.  For example, it is entirely possible to use Cassandra like a relational database, but in so doing much of the value of the solution would be lost.

  • Prediction No. 3: CIOs Make CDOs Unnecessary

The notion of a Chief Digital Officer (CDO) is perhaps a role that exists only in larger organizations.  Someone, whether it be a CDO or the CIO, needs to truly own the responsibility for protecting the organization’s customer data.  High-profile security breaches such as those at Target, Sony, the US Army, and many others have made it apparent that this is a huge issue.  Organizations are under near-constant attack from outsiders who at best want to put an offensive message on the home page and at worst want to hold data hostage for ransom.  Protection of data starts with penetration testing, applying best-of-breed solutions (FireEye, Palo Alto Networks), and aggressively patching vulnerabilities as they are found.  All of this merely maintains the status quo.  Forrester rightfully points out that the best organizations will find ways for the Chief Marketing Officer (CMO) to be tightly aligned with the CIO to leverage technology to move the business forward.

Become leaders of digital change – or be usurped. Somebody has to be in charge of increasingly connected and dependent technology for the enterprise.  Fast-cycle, tech-based innovation driving a coherent, cross-channel digital experience is crucial to succeeding in today’s markets. …  Are all CIOs up for the challenge? No.  But in 2015, any CIO who isn’t will be replaced by one who is.

As Zig Ziglar said “Success occurs when opportunity meets preparation.”

Thoughts on ORM Tools

January 14, 2015

The following is a summary of an email thread discussing Object-Relational Mapping (ORM) tools.  In my experience developers hold strong opinions about ORM tools.  In a past life my organization used LLBLGen, and the folks who were most informed on ORM tools had strong opinions that it was much better than both nHibernate and Entity Framework.  As a conversation starter I provided two articles from December of 2013 and a follow-up from February 2014 comparing the various ORM / data access frameworks.  I wanted to see where my organization stood on the topic of ORM.

As expected there were strong opinions.  I found that there were essentially two camps – believers and non-believers.  Interestingly, the group (of about 10 very well informed senior folks) was evenly split on whether ORM is worth the effort.  Also very interesting was that there was little disagreement about the pros and cons of ORM.


The “Believers” are proponents of Microsoft’s Entity Framework.  I am apparently the only one to have ever used LLBLGen.  Somewhat surprisingly, no one in the group had any significant experience with nHibernate.  Some had passing experience with the micro ORMs Dapper and PetaPoco.  Believers say that the savings achieved by having clean, very flexible data access layer code are worth the overhead of maintaining the ORM.  Their argument is that the investment in tuning the ORM is smaller than the productivity gains achieved from its usage.
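The thread’s frame of reference is .NET, but the mechanics the Believers are buying into are language-neutral.  As a toy sketch of what an ORM does under the hood – here in Python, with the standard-library sqlite3 module standing in for ADO.NET, and all names invented for illustration – a mapper turns objects into SQL and rows back into objects:

```python
import sqlite3
from dataclasses import dataclass, fields

@dataclass
class Customer:
    id: int
    name: str
    city: str

class MiniMapper:
    """Toy ORM: maps a dataclass to a table of the same (lowercased) name."""
    def __init__(self, conn, cls):
        self.conn, self.cls = conn, cls
        self.table = cls.__name__.lower()
        self.cols = [f.name for f in fields(cls)]

    def insert(self, obj):
        # Generate the INSERT from the mapped fields - no hand-written SQL.
        placeholders = ", ".join("?" for _ in self.cols)
        self.conn.execute(
            f"INSERT INTO {self.table} ({', '.join(self.cols)}) "
            f"VALUES ({placeholders})",
            [getattr(obj, c) for c in self.cols])

    def find(self, **criteria):
        # Generate a SELECT from keyword criteria and hydrate objects.
        where = " AND ".join(f"{k} = ?" for k in criteria)
        rows = self.conn.execute(
            f"SELECT {', '.join(self.cols)} FROM {self.table} WHERE {where}",
            list(criteria.values()))
        return [self.cls(*row) for row in rows]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, city TEXT)")
mapper = MiniMapper(conn, Customer)
mapper.insert(Customer(1, "Acme", "Boston"))
print(mapper.find(city="Boston"))  # [Customer(id=1, name='Acme', city='Boston')]
```

Real ORMs layer change tracking, relationships, and lazy loading on top of this core idea – which is where both the productivity gains and the tuning overhead come from.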


The “Non-Believers” counter that the overhead associated with maintaining an ORM tool does not justify the return on investment.  They believe that stored procedures, called from custom data access layer code written in ADO.NET, are best.  Some have built code templates to help generate and maintain their data access layer.  They believe this really helps efficiency while keeping full control of the code and its execution.
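For contrast, the hand-rolled approach looks roughly like this (again a Python/sqlite3 sketch with illustrative names; in the .NET shops described in the thread the SQL would live in stored procedures called via ADO.NET):

```python
import sqlite3

class CustomerRepository:
    """Hand-written data access layer: every statement is explicit,
    tunable SQL; the trade-off is writing one method per operation."""
    def __init__(self, conn):
        self.conn = conn

    def add(self, cid, name, city):
        self.conn.execute(
            "INSERT INTO customer (id, name, city) VALUES (?, ?, ?)",
            (cid, name, city))

    def by_city(self, city):
        return self.conn.execute(
            "SELECT id, name FROM customer WHERE city = ? ORDER BY name",
            (city,)).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT, city TEXT)")
repo = CustomerRepository(conn)
repo.add(1, "Acme", "Boston")
repo.add(2, "Zenith", "Boston")
print(repo.by_city("Boston"))  # [(1, 'Acme'), (2, 'Zenith')]
```

Every query is under the author’s control and can be tuned in isolation – which is exactly what the code-generation templates mentioned above are meant to make less tedious.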

Pros and Cons

There was broad consensus around the pros and cons of ORM – again based on experience with Entity Framework versions 5 and 6.

Pros:

  • Relatively straight-forward. It has good default conventions and rules.
  • Functional – it implements three approaches (code-first, model-first, and database-first), inheritance support, and eager and lazy loading.
  • Flexible. It’s possible to change conventions and rules and to select only needed relations.

Cons:

  • Hard to fine-tune EF (e.g., query optimization). In half the cases you end up writing SQL manually and executing it from the EF context.
  • Not very good for complex models. SQL queries become very large (up to several pages) and hard to understand.
  • Slow to fetch large datasets (thousands of rows).
  • Not suitable for batch operations (insert, update, delete).
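The batch-operation con is easy to see concretely: an ORM’s change tracker typically flushes one statement per entity, where the driver could take the whole batch at once.  A sketch of the difference (Python/sqlite3 for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
rows = [(i,) for i in range(10_000)]

# ORM-style: one INSERT per tracked entity - 10,000 separate statements.
for r in rows:
    conn.execute("INSERT INTO t (n) VALUES (?)", r)

# Batch-style: a single executemany call the driver can optimize.
conn.executemany("INSERT INTO t (n) VALUES (?)", rows)

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 20000
```

Both paths load the same data; the statement counts (and, over a network, the round trips) are what make the entity-at-a-time style painful at volume.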

Net net

There is a range of problems where ORM would be a good solution and others where it would not.  Small, relatively non-transactional applications seem to be a good fit.  As the volume of data grows, the value gap narrows in favor of well-done, hand-crafted SQL.  The tradeoff is obviously that simple changes take more time to implement and test than with something like EF.

ORM seemingly can be made to work for most applications – the question is at what cost.  Hand-coding SQL might not make sense for an application with hundreds of database tables.  On the other hand, ORM might not make sense for a highly transactional database.  In the end my sense is that this comes down to people and your architect’s preference.  The choice of an ORM is like choosing a platform – .NET MVC or Ruby on Rails, SQL Server or MySQL, Linux or Windows.  While there are some people out there who can easily move between platforms, in my experience developers have preferences and comfort zones.  The choice of whether to use an ORM tool, and if so which one, is both an application and a personal decision.


Cloud computing is as much a mindset as it is a technology strategy

August 21, 2014

An analogy I’ve used in the past is that were Marc Benioff (CEO of Salesforce – arguably the most forward-leaning organization in the world of enterprise cloud computing) to acquire a business built around a collection of “stove pipe,” on-premise solutions, one of the first things he would do is dictate that everything be moved to the cloud. Marc Benioff was one of the first CEOs to truly “get it.”  Everything his organization does is viewed through the lens of the cloud.

Increasingly, it is clear that there is a viable and economical cloud computing solution for nearly every situation.


Organization Size | Security Req. | Existing Investment | IT Group
Entrepreneurs / Sole Proprietorships | Basic | $ | None
Startups | Varies | $ | Developer
Small Business (less than $1M in annual sales) | Varies | $ | Shared
Small-to-medium Business ($1M-$10M in sales) | Varies | $$ | Maybe
Mid-sized Companies ($10M-$100M in sales) | Medium | $$$ | Likely
Medium-to-Large Companies ($100M-$1B in sales) | High | $$$$ | Yes
Enterprises ($1B+ in sales) | High | $$$$$ | Definitely

* Columns are meant to represent typical situations

To some there are as many reasons not to embrace the cloud as there are to adopt it.  It is a matter of perspective.  Those positively disposed to a SaaS strategy will find a way.  Those who are not will find reasons (excuses) why it does not make sense, is too costly, or would be too much of a distraction.  Where there is a will there is a way; however, the size of the organization (often a proxy for existing investment) reasonably weighs on the difficulty of making the leap to the cloud.

Existing investment – For smaller entities or individuals there may be cheaper alternatives than hosting in the cloud; however, the effort associated with maintaining hardware and the ease of scaling are reasons why a cloud-based solution makes sense over the long term.  For larger entities and enterprises where there is an existing investment in other technologies, such as a datacenter that cannot be easily abandoned, a hybrid hosting strategy often makes sense.

Security – A common reason cited not to use a cloud-based solution is that customers demand a higher level of security than can be achieved in a public cloud.  Organizations like Salesforce and Google (Apps) have demonstrated for all the world to see that even the most sensitive information can be held securely in a public cloud.  Arguably, when properly configured, information in a public cloud is held as securely as, if not more securely than, in an on-premise solution.  Other than high-security government data there is little information which cannot be “trusted to the cloud.”

Cost – The cost profile of hosting an application in the cloud is indeed different than if hosted locally.  There are fewer upfront costs, and expenses are tied to usage.  For larger entities there is a material difference in how the costs are accounted: cloud costs are operating expenses that must be incurred in the current reporting period, whereas much of the cost of on-premise equipment can be capitalized and depreciated over a number of years.

Complexity – Hosting in a public cloud poses new challenges to individuals and IT organizations which may have become accustomed to owning their own hardware.  Deploying to the cloud, ensuring proper security, scaling, and hundreds of other tasks are new skills that IT organizations have to acquire.  Those that do will thrive.  Those that do not will be left behind.


FUD – Fear, Uncertainty, and Doubt is an expression that has been used in the computer industry since the mid-1970s to refer to a technique used by those who seek to stymie the adoption of new technology by exploiting a lack of understanding.  One of the first applications of FUD was by IBM salespeople to undermine buyers’ confidence in the competition.  For those with a stake in the status quo, moving to the cloud represents at best more work and at worst increased cost, downtime, and frustration as they figure their way through new technology.

In reality, Salesforce would likely never acquire a business so wedded to what it would view as an antiquated technology strategy.  When viewed in this way, the degree to which an organization has embraced the cloud can be seen as a competitive advantage – or indeed a disadvantage.

