Thoughts on RightScale's Annual State of the Cloud Report

May 2, 2015

In January of 2015, cloud portfolio management company RightScale Inc. surveyed 930 users of cloud services for its annual State of the Cloud report.  The findings are both interesting and insightful.  Several key findings are highlighted here.

  1. Cloud is a given and hybrid cloud is the preferred strategy. According to the survey, 93% of respondents are using the cloud in one way or another.  Further, more than half (55%) of enterprises are using hybrid clouds – either private clouds or an integration with on-premise solutions.
  • Savvy start-ups realize that public clouds can be expensive relative to self-hosting in an economy co-lo facility. Until traffic ramps to the point where the ability to scale immediately justifies it, there is no urgency to host in AWS or Azure.
  • Public clouds are ideal for a variety of scenarios – unknown, unpredictable, or spiking traffic, the need to host in a remote geography, or situations where an organization has priorities other than hosting. Conversely, self-hosting can be more economical.  For example, an Amazon c3.2xlarge – 8 vCPU and 16 GB RAM – runs $213 per month as of May 2015, or approximately $2,500 per year per server (see the cost sketch after this list).  Organizations that already have an investment in a data center or have on-premise capacity often find it cost-effective to self-host internal applications.
  • Not surprisingly, many enterprises are reluctant to walk away from significant capital investments in their own equipment. Hybrid clouds allow organizations to continue to extract value from these investments for tasks that may be difficult or costly to implement in a public cloud – for example, high-security applications, solutions that must interact with behind-the-firewall systems, or processing- and resource-intensive programs.
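To make the arithmetic concrete, here is a minimal sketch of the cloud-versus-co-lo break-even calculation.  The AWS figure is the May 2015 price quoted above; the co-lo hardware and hosting numbers are purely illustrative assumptions, not quotes.

```python
# Rough cloud-vs-colo cost comparison. The c3.2xlarge monthly price is the
# May 2015 figure quoted above; the co-lo numbers are illustrative assumptions.
AWS_MONTHLY = 213.00           # c3.2xlarge, 8 vCPU / 16 GB RAM, per month
COLO_SERVER_CAPEX = 4000.00    # assumed up-front hardware cost per server
COLO_MONTHLY = 75.00           # assumed rack space, power, and bandwidth

def cumulative_cost_aws(months: int) -> float:
    return AWS_MONTHLY * months

def cumulative_cost_colo(months: int) -> float:
    return COLO_SERVER_CAPEX + COLO_MONTHLY * months

if __name__ == "__main__":
    for months in (12, 24, 36):
        print(f"{months:>2} months: AWS ${cumulative_cost_aws(months):>8,.0f} "
              f"vs co-lo ${cumulative_cost_colo(months):>8,.0f}")
```

Under these assumptions the co-lo server pays for itself somewhere in the third year, which is why the decision hinges on how long, and how predictably, the capacity will be needed.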

93% of Respondents Are Using the Cloud

  2. DevOps rises; Docker soars. DevOps is the new agile.  It is the hip buzzword floating around every organization.  According to Gene Kim, author of The Phoenix Project, DevOps is the fast flow of code from idea to customer hands.  The manifestation of DevOps is the ability to release code as frequently as several times a day.  To achieve this level of flexibility organizations need to eliminate bottlenecks and achieve what Kim calls flow.  Tools like Puppet, Chef, and Docker are enablers for DevOps; a small sketch of scripting these tools follows below.  In forthcoming surveys it can be expected that Microsoft's InRelease (part of Visual Studio Online) and Hyper-V Containers will have prominent roles in organizations that use the Microsoft stack.
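As a flavor of how these enablers get scripted, here is a minimal sketch of a build-test-ship step using the Docker SDK for Python.  The image tag, test command, and registry are hypothetical placeholders; a real pipeline would drive something like this from a CI server.

```python
# Minimal build-test-ship step scripted against the Docker SDK for Python.
# The image tag and test command are hypothetical placeholders.
import docker

client = docker.from_env()

# Build an image from the Dockerfile in the current directory.
image, _ = client.images.build(path=".", tag="myapp:candidate")

# Run the test suite in a throwaway container; raises ContainerError on failure.
client.containers.run("myapp:candidate", command="pytest", remove=True)

# If the tests pass, push the candidate image to the registry for deployment.
client.images.push("myapp", tag="candidate")
```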

DevOps Adoption Up in 2015

  3. Amazon Web Services (AWS) continues to dominate in public cloud, but Azure makes inroads among enterprises. AWS adoption is 57 percent, while Azure IaaS is second at 12 percent.  (Among enterprise respondents, Azure IaaS narrows the gap with 19 percent adoption as compared to AWS with 50 percent.)  This is consistent with other market surveys – see the Synergy Research Group study from October 2014.
  • At this point the market has effectively narrowed to two major cloud IaaS providers: Amazon and Azure. While there are other offerings from Rackspace, IBM, HP, and other non-traditional sources (e.g., Verizon), these seem to be solutions for organizations that already have a relationship with that vendor or that have a specific reason to move away from the market leaders.
  • There are certainly many other PaaS solutions, including Google, Salesforce.com, and Heroku (owned by SFDC). Similarly, there are many SaaS solutions, including Google Apps, NetSuite, Salesforce.com, Taleo, and many other vertical-specific offerings.
  • This respondent base is heavily represented by small business – 624 SMB vs. 306 enterprise respondents. Although Microsoft is working hard to attract start-ups, the reality is that today most entrepreneurs choose open source technologies over Windows.  Conversely, Microsoft technologies are disproportionately represented in larger enterprises.  While today AWS is the undisputed market leader, Azure is growing quickly and can be expected to close the gap.  Microsoft is investing heavily in its technology, is actively reaching out to the open source community, and is making it apparent that it is not satisfied with being an also-ran.
  4. Public cloud leads in breadth of enterprise adoption, while private cloud leads in workloads.
  5. Private cloud stalls in 2015 with only small changes in adoption.
  • Private clouds are being used for functional and load testing as well as for hosting internal applications (e.g., intranets) where the costs and risks associated with a public footprint do not exist. In the past, organizations would have had "farms" of low-end desktop PCs and blade servers in server closets; it makes sense that these types of applications have moved to private clouds hosted on virtualized servers that can be centrally managed, monitored, and delivered to users more cost-effectively.
  • It is interesting that the data suggests the market for virtualization infrastructure has matured and is not growing. The market leader in this space continues to be VMware, with Microsoft gaining traction in enterprises.
  6. Significant headroom remains for more enterprise workloads to move to the cloud. An interesting data point – 68% of enterprise respondents say that less than 20% of their applications are currently running in the cloud.
  • It will be interesting to see how this number changes over time. Reversing the statistic – 80% of enterprise applications are still run on premise.  This could be due to IT organizations' heavy investment in capitalized equipment and data centers.  It could be that public cloud economics are still too expensive to justify the move.  There could be technical limitations, such as security, holding back cloud adoption.  Finally, there could be organizational prejudices against taking what is perceived as a risk in embracing the public cloud.  Very likely it is all of the above.
  • The role of a visionary CTO is to move their organization forward to embrace new technologies, break down prejudices, and find new and better ways to serve customers. Cloud vendors are working to make it easier for organizations of all sizes to adopt the cloud by lowering cost, increasing security, and providing new features which make management more seamless.
  • While this study does not provide any data on the breakdown of PaaS vs. IaaS, it is a reasonable assumption that most enterprise adoption of the cloud is IaaS, as this is by and large simply re-hosting an application as-is. PaaS applications, on the other hand, typically need more integration, which in many cases involves software development.  Once done, however, PaaS applications are often more secure, scalable, and extensible because they take advantage of the hosting platform's infrastructure.

Cloud Challenges 2015 vs. 2014

Finally, RightScale has a proprietary maturity model which ranks organizations' comfort level with cloud-related technologies.  Interestingly, the data suggests that nearly 50% of organizations have yet to do any significant work with the cloud.  This data can certainly be expected to change over the next 2-3 years.

Cloud Maturity of Respondents


Predictions 2015: CIOs Accelerate The Business Technology Agenda

February 21, 2015

At the end of 2014 Forrester published a short research note entitled "Predictions 2015: CIOs Accelerate The Business Technology Agenda."  (The report is $499 if you get it from Forrester, but Microsoft has apparently licensed the content to make it free.)  Some of the provocative ideas in the note include:

Many CIOs have the technical expertise and cross-functional business purview to help drive digital innovation, but they are too often still seen as the leader of a cost center.

This is the truth.   Budget season is not much fun for anyone, but for those of us in IT it is a constant battle for funding.  Many times the best outcome is level funding.  Generating revenue on our own would change the game.

Both consumer and business customers increasingly expect real-time access to connected product and service information. These expectations not only define customer engagements but also ripple throughout the supply chain — shortening product cycles, requiring more agile operational capabilities, and creating opportunities for new, disruptive digital services.

Even some of the smallest businesses – appliance dealers, car services, even dentists – offer on-line access to inventory, scheduling, and account information.  Larger companies that offer primitive or, worse, no access to this type of information absolutely stand out as laggards. Increasingly the expectation is not only that information will be on-line and accessible; organizations without a credible mobile strategy are also at a competitive disadvantage.

  • Prediction No. 1: CIOs Accelerate The Business Technology Agenda

There are a ton of buzzwords buried in this prediction, including agile development, connected products, mobile services, and customer-focused governance models.  The reality that many of us live with every day is doing more with less.  Technology is the lever for meeting these new challenges.  For example, automation of tasks previously done only by humans can lead to dramatic cost and time savings.  Better, cleaner execution using new development techniques (e.g., DevOps, automated testing) makes it faster and more efficient for development teams to get code into the hands of customers.  Cross-platform development is now realistic using tools like Xamarin.  Finally, highly reliable consumption-based cloud delivery platforms are available from multiple providers.  While the challenges have never been more formidable, the tools at our disposal have never been more powerful.

  • Prediction No. 2: CIOs Unlock Data-Driven Business Opportunities

According to CSC there will be more than 35 ZB of data generated in the year 2020 – 4,300% growth from 2009.  It's hard to visualize that much data.  Much of that information will come from sensor data (think connected refrigerator) that has previously never been on-line or accessible.  While there will no doubt be new unstructured data such as this, traditional web and mobile applications (Facebook, Instagram, and Twitter) will continue to evolve and grow.  The dramatic growth of information calls for new solutions to make sense of it all.  Again there is cause for real optimism that technology is keeping pace: there are powerful NoSQL / NewSQL databases such as Cassandra, analysis engines such as Hadoop, and visualization platforms such as Tableau.  Many of these technologies are open source and most of the commercial solutions are reasonably priced – often tied to consumption.  One of the primary challenges is the imperative to get technology organizations thinking differently.  For example, it is entirely possible to use Cassandra like a relational database, but in doing so much of the value of the solution would be lost.
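To make the "thinking differently" point concrete, here is a minimal sketch of query-first data modeling with the DataStax Python driver for Cassandra; the keyspace, table, and data are hypothetical.  Rather than normalizing and joining as in an RDBMS, the table is keyed by the exact question the application will ask.

```python
# Query-first modeling with the DataStax Python driver. Instead of a
# normalized users/posts pair joined at read time, the table is keyed by
# the exact query we need: "recent posts for a user."
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("demo")  # hypothetical keyspace

session.execute("""
    CREATE TABLE IF NOT EXISTS posts_by_user (
        user_id   text,
        posted_at timeuuid,
        body      text,
        PRIMARY KEY ((user_id), posted_at)
    ) WITH CLUSTERING ORDER BY (posted_at DESC)
""")

# Reads hit a single partition -- no joins, no scatter-gather.
rows = session.execute(
    "SELECT body FROM posts_by_user WHERE user_id = %s LIMIT 10",
    ("alice",))
for row in rows:
    print(row.body)
```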

  • Prediction No. 3: CIOs Make CDOs Unnecessary

The notion of a Chief Digital Officer (CDO) is perhaps a role that exists in larger organizations.  Someone, whether it be a CDO or the CIO, needs to truly own the responsibility for protecting the organization's customer data.  High-profile security breaches such as those at Target, Sony, the US Army, and many others have made it apparent that this is a huge issue.  Organizations are under near-constant attack from outsiders who at best want to put an offensive message on the home page and at worst want to hold data hostage for ransom.  Protection of data starts with penetration testing, applying best-of-breed solutions (FireEye, Palo Alto Networks), and aggressively patching vulnerabilities as they are found.  All of this merely maintains the status quo.  Forrester rightfully points out that the best organizations will find ways for the Chief Marketing Officer (CMO) to be tightly aligned with the CIO to leverage technology to move the business forward.

Become leaders of digital change – or be usurped. Somebody has to be in charge of increasingly connected and dependent technology for the enterprise.  Fast-cycle, tech-based innovation that drives a coherent, cross-channel digital experience is crucial to succeeding in today's markets. …  Are all CIOs up for the challenge? No.  But in 2015, any CIO who isn't will be replaced by one who is.

As Zig Ziglar said, "Success occurs when opportunity meets preparation."


Thoughts on ORM Tools

January 14, 2015

The following is a summary of an email thread discussing Object-Relational Mapping (ORM) tools.  In my experience developers hold strong opinions about ORM tools.  In a past life my organization used LLBLGen, and the folks who were most informed on ORM tools had strong opinions that it was much better than both nHibernate and Entity Framework.   As a conversation starter I provided two articles from December of 2013 and a follow-up from February 2014 comparing the various ORM / data access frameworks.  I wanted to see where my organization stood on the topic of ORM.

As expected there were strong opinions.  I found that there were essentially two camps – believers and non-believers. Interestingly, the group (of about 10 very well-informed senior folks) was evenly split on whether ORM is worth the effort.  Also very interesting was that there was little disagreement about the pros and cons of ORM.

Believers

The "Believers" are proponents of Microsoft's Entity Framework.  I am apparently the only one to have ever used LLBLGen.   Somewhat surprisingly, no one in the group had any significant experience with nHibernate.  Some had passing experience with the micro ORMs Dapper and PetaPoco.  Believers say that the savings achieved by having a clean, very flexible data access layer are worth the investment in the overhead of maintaining the ORM.  Their argument is that the investment in tuning the ORM is smaller than the productivity gains achieved from its usage.

Non-believers

This group believes that the overhead associated with maintaining an ORM tool does not justify the return on investment.  They believe that stored procedures called from custom data access layer code written in ADO.NET are best.  Some have built code templates to help generate and maintain their data access layer.  They believe this really helps efficiency while keeping full control over the code and its execution.  A sketch contrasting the two styles follows below.
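The thread was about Entity Framework and ADO.NET, but the tradeoff is easy to sketch in Python terms, with SQLAlchemy standing in for the ORM camp and the raw DB-API for the hand-coded camp.  The schema and names are hypothetical.

```python
# The same query written both ways. SQLAlchemy stands in for the ORM camp;
# the raw DB-API call stands in for the hand-coded data access layer.
# Table and column names are hypothetical.
import sqlite3
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class Customer(Base):
    __tablename__ = "customers"
    id     = Column(Integer, primary_key=True)
    name   = Column(String)
    region = Column(String)

engine = create_engine("sqlite:///app.db")
Base.metadata.create_all(engine)

# ORM camp: the mapping layer writes the SQL for you.
with Session(engine) as session:
    east = session.query(Customer).filter_by(region="east").all()

# Hand-coded camp: you own every character of the SQL that executes.
conn = sqlite3.connect("app.db")
rows = conn.execute(
    "SELECT id, name FROM customers WHERE region = ?", ("east",)).fetchall()
```

The ORM version stays terse as the model grows; the hand-coded version stays transparent and tunable.  That is the whole debate in miniature.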

Pros and Cons

There was broad consensus around the pros and cons of ORM – again based on experience with Entity Framework versions 5 and 6.

Pros:
  • Relatively straightforward, with good default conventions and rules.
  • Functional – it implements three approaches (code-first, model-first, and database-first), inheritance support, and eager and lazy loading.
  • Flexible – it is possible to change conventions and rules and to select only the needed relations.

Cons:
  • Hard to fine-tune EF (e.g., query optimization). In half of the cases one ends up writing SQL manually and executing it from the EF context.
  • Not very good for complex models. SQL queries become very large (up to several pages) and hard to understand.
  • Slow to fetch large datasets (thousands of rows).
  • Not suitable for batch operations (insert, update, delete).

Net net

There is a range of problems for which ORM would be a good solution and others for which it would not.  Small, relatively non-transactional applications seem to be a good fit.  As the volume of data grows, the advantage shifts toward well-done, hand-crafted SQL.  The tradeoff is obviously that simple changes then take more time to implement and test than they would with something like EF.

ORM can seemingly be made to work for most applications – the question is at what cost.  Hand-coding SQL might not make sense for an application with hundreds of database tables.  On the other hand, ORM might not make sense for a highly transactional database.   In the end my sense is that this comes down to people and your architect's preference.  The choice of an ORM is like choosing a platform – .NET MVC or Ruby on Rails, SQL Server or MySQL, Linux or Windows.  While there are some people who can easily move between platforms, in my experience developers have preferences and comfort zones.  The choice of whether to use an ORM tool, and if so which one, is both an application and a personal decision.

References

https://www.devbridge.com/articles/entity-framework-6-vs-nhibernate-4/

http://stackoverflow.com/questions/2891905/should-i-use-entity-framework-instead-of-raw-ado-net

http://weblogs.asp.net/fbouma/fetch-performance-of-various-net-orm-data-access-frameworks-part-2

http://weblogs.asp.net/fbouma/fetch-performance-of-various-net-orm-data-access-frameworks


Cloud computing is as much a mindset as it is a technology strategy

August 21, 2014

An analogy I've used in the past: were Marc Benioff (CEO of Salesforce.com – arguably the most forward-leaning organization in the world of enterprise cloud computing) to acquire a business built around a collection of "stovepipe" on-premise solutions, one of the first things he would do is dictate that everything be moved to the cloud. Marc Benioff was one of the first CEOs to truly "get it."  Everything his organization does is viewed through the lens of the cloud.

Increasingly, it is clear that there is a viable and economical cloud computing solution for nearly every situation.


Organization Size                                Security Req.  Existing Investment  IT Group
Entrepreneurs / Sole Proprietorships             Basic          $                    None
Startups                                         Varies         $                    Developer
Small Business (less than $1M in annual sales)   Varies         $                    Shared
Small-to-medium Business ($1M-$10M in sales)     Varies         $$                   Maybe
Mid-sized Companies ($10M-$100M in sales)        Medium         $$$                  Likely
Medium-to-Large Companies ($100M-$1B in sales)   High           $$$$                 Yes
Enterprises ($1B+ in sales)                      High           $$$$$                Definitely

* Columns are meant to represent typical situations

To some there are as many reasons not to embrace the cloud as there are to adopt it.  It is a matter of perspective.  Those positively disposed to a SaaS strategy will find a way.  Those who are not will find reasons (excuses) why it does not make sense, is too costly, or would be too much of a distraction.  Where there is a will there is a way; however, the size of the organization (often a proxy for existing investment) reasonably weighs on the level of difficulty of making the leap to the cloud.

Existing investment – For smaller entities or individuals there may be cheaper alternatives than hosting in the cloud; however, the effort associated with maintaining hardware and the ease of scaling are reasons why a cloud-based solution makes sense over the long term.  For larger entities and enterprises where there is an existing investment in other technologies, such as a datacenter that cannot be easily abandoned, a hybrid hosting strategy often makes sense.

Security – A common reason cited not to use a cloud-based solution is that customers demand a higher level of security than can be achieved in a public cloud.  Organizations like Salesforce.com and Google (Apps) have demonstrated for all the world to see that even the most sensitive information can be held securely in a public cloud.  Arguably, when properly configured, information in a public cloud is held as securely as – if not more securely than – in an on-premise solution.  Other than high-security government data there is little information which cannot be "trusted to the cloud."

Cost – The cost profile of hosting an application in the cloud is indeed different than hosting it locally.  There are fewer upfront costs and expenses are tied to usage.  For larger entities there is a material difference in how the costs are accounted: cloud costs are operating expenses incurred in the current reporting period, whereas much of the cost of on-premise infrastructure can be capitalized and depreciated over a number of years.

Complexity – Hosting in a public cloud poses new challenges to individuals and IT organizations which may have become accustomed to owning their own hardware.  Deploying to the cloud, ensuring proper security, scaling, and hundreds of other tasks are new skills that IT organizations have to acquire.  Those that do will thrive.  Those that do not will be left behind.


FUD – Fear, Uncertainty, and Doubt – is an expression that has been used in the computer industry since the mid-1970s to refer to a technique used by those who seek to stymie the adoption of new technology by playing on a lack of understanding.  One of the first applications of FUD was by IBM salespeople to undermine buyers' confidence in the competition.  For those with a stake in the status quo, moving to the cloud represents at best more work and at worst increased cost, downtime, and frustration as they figure their way through new technology.

In reality Salesforce.com would likely never acquire a business so wedded to what it would view as an antiquated technology strategy.  Seen this way, the degree to which an organization has embraced the cloud can be a competitive advantage or indeed a disadvantage.


Enterprise Adoption of Cloud Technology

July 29, 2014

Forrester recently published a research note on enterprise adoption of cloud technology.  The full report can be downloaded here from Akamai.com (after registration).  As the report was commissioned by Akamai, which is absolutely not a neutral third party, the results need to be considered with caution.  That said, there are some interesting conclusions.

  • Public cloud use is increasing across a number of business-critical use cases.

This is not a surprise.  Public clouds have become mainstream.  Amazon's case study page is a who's who of well-known traditional brand names including Hess, Suncorp, Dole, and Pfizer as well as newer technology-oriented companies such as Netflix, Shazam, Airbnb, and Expedia.

  • Cloud success comes from mastering “The Uneven Handshake.”

The gist of this point is that organizations have specific requirements (e.g., security, access to behind-the-firewall data, etc.) which may be incompletely fulfilled by a particular cloud offering.  In order to use a cloud solution it may be necessary to piece together multiple providers' solutions with custom "glue" code.

  • It’s a hybrid world

Most organizations that have been around for a while have an investment in on-premise systems.  In addition to providing valuable services that work (think: don't fix what isn't broken), they are known commodities and are typically capitalized pieces of equipment and software.  In a perfect world it would often be cleaner to create a homogeneous configuration entirely on a cloud platform.  Unfortunately we do not live in a perfect world, and many times cloud systems have to be made to co-exist with legacy systems for technical, cost, or other reasons.

One particularly interesting finding is that most enterprises are quite satisfied with their investment in the cloud.  This conclusion is illustrated in the following figure.

How well did your chosen service actually meet key metrics?

Enterprise Considerations

As organizations begin the journey to the cloud, or expand their operations in it, there are a number of important considerations.  Each of these topics stands on its own and literally thousands of pages of documentation exist on each.  Here are some brief overview thoughts.

  • Platform as a Service (PaaS) or Infrastructure as a Service (IaaS)

In a PaaS configuration the provider manages the infrastructure, scalability, and everything other than the application software.  In an IaaS configuration the enterprise that licenses the software has total control of the platform.  There are pros and cons to both.  PaaS can be very appropriate for small organizations who wish to off-load as much of the hosting burden as possible, but PaaS platforms offer less control and less flexibility.  IaaS provides organizations as much control as they would have in a self-hosted model; the tradeoff is that the organization is responsible for provisioning and maintaining all aspects of the infrastructure.  Enterprises new to the cloud may find that their IT group is most comfortable with IaaS as it is much more familiar territory.  As the IT group is the one that answers the panicked call at 2:00 AM, its conservative nature can be understood.

  • Picking the right provider

Google App Engine, Salesforce.com, Heroku, and Amazon Elastic Beanstalk are some of the most well-known PaaS platforms.  Amazon's EC2 platform and Microsoft Azure Virtual Machines are the two dominant platforms in the IaaS space.  (Azure also has a rich PaaS offering called Web Sites.)  Rackspace has very strong offerings as well – particularly in the IaaS space.

  • Platform lock in

With an IaaS model careful consideration should be given to the selection of technology components.  To a point made in the Forrester report, interfaces between existing components need to be considered and configured to work together.  Further consideration should be given to whether platform-specific technologies should be used.  For example, Amazon offers a proprietary queuing solution (SQS – Simple Queue Service), while RabbitMQ is a well-respected open source queuing platform.  The choice of SQS would lock an organization into Amazon, where the choice of RabbitMQ allows more flexibility to shift to another platform.  Again these are tradeoffs to be considered; a sketch of one way to hedge follows below.
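One common hedge is to keep a thin messaging interface of your own and write one small adapter per provider, so application code never imports a vendor SDK directly.  A minimal sketch using boto3 for SQS and pika for RabbitMQ; the queue names and URL are hypothetical.

```python
# A thin publishing interface so application code never depends on a
# vendor SDK directly. Queue URL and names are hypothetical.
import boto3
import pika

class SqsPublisher:
    def __init__(self, queue_url: str):
        self._sqs = boto3.client("sqs")
        self._queue_url = queue_url

    def publish(self, body: str) -> None:
        self._sqs.send_message(QueueUrl=self._queue_url, MessageBody=body)

class RabbitPublisher:
    def __init__(self, host: str, queue: str):
        conn = pika.BlockingConnection(pika.ConnectionParameters(host=host))
        self._channel = conn.channel()
        self._channel.queue_declare(queue=queue, durable=True)
        self._queue = queue

    def publish(self, body: str) -> None:
        self._channel.basic_publish(
            exchange="", routing_key=self._queue, body=body)

# Application code depends only on the publish() contract, so swapping
# providers becomes a configuration change rather than a rewrite.
publisher = RabbitPublisher("localhost", "orders")
publisher.publish("order-created:1234")
```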

  • Security

With enough time and effort, public cloud technology can theoretically be made as secure as an on-premise solution.  The Forrester report considers this topic, noting: "The most common breaches that have occurred in the cloud have not been the fault of the cloud vendors but errors made by the customer."  Should an organization decide to hold sensitive, business-critical information in the cloud, a best practice would be to retain a subject matter expert in cloud security and conduct regular third-party penetration testing.

  • Global Footprint and Responsiveness

One of the advantages of working with a public cloud provider is that an organization can cost-effectively host its applications around the world.  For example, Amazon offers three hosting options in the Asia Pacific zone alone and nine regions world-wide.  Hosting in another geography is on the surface attractive for improving response times for customers as well as for complying with country-specific privacy regulations.  For most organizations, hosting in a shared public cloud is much cheaper than self-hosting in a remote geography.  Organizations should be aware that hosting in a given region may or may not improve response times, depending on how their customers access the service; your mileage may vary with customer network routing.  Performance testing using a service like Compuware can help identify how your customers access your content (a simple do-it-yourself sketch follows below).  Similarly, care needs to be taken to ensure compliance with privacy laws.  For example, it is a well-known requirement that PII data from EU citizens should not leave Europe without the user's consent.  A public cloud can be used to comply with this directive; however, should administrators from the US have the ability to extract data from those machines, the organization may not be meeting the requirements of the law.
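Short of a commercial testing service, a quick sanity check is simply to time a small request against each candidate region from wherever your customers sit.  A minimal sketch, with hypothetical endpoints:

```python
# Time a small request against candidate regions to sanity-check whether a
# closer region actually helps your users. Endpoints are hypothetical.
import time
import urllib.request

REGIONS = {
    "us-east":      "https://us-east.app.example.com/ping",
    "eu-west":      "https://eu-west.app.example.com/ping",
    "ap-southeast": "https://ap-southeast.app.example.com/ping",
}

for region, url in REGIONS.items():
    start = time.monotonic()
    try:
        urllib.request.urlopen(url, timeout=10).read()
        print(f"{region:>12}: {(time.monotonic() - start) * 1000:.0f} ms")
    except OSError as exc:
        print(f"{region:>12}: failed ({exc})")
```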

  • Uptime and monitoring

Finally, enterprises need to be concerned with uptime.  It is a law of nature that all systems go down.  Even the biggest, most well-maintained systems have unplanned outages.  Nearly every cloud system has a distributed architecture, such that rarely does the entire network go down at the same time.  Organizations should carefully consider (and test) how they monitor their cloud-hosted systems and how they fail over should an outage occur, just as they do with on-premise solutions.  Should an organization embrace a hybrid hosting strategy, the cloud could fail over to the self-hosted platform and vice versa.  A minimal sketch of an external health check appears below.
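As a flavor of what such monitoring involves, here is a minimal sketch of an external health check that counts consecutive failures before triggering a failover hook.  The endpoint is hypothetical and the hook is a placeholder for a real DNS or load balancer change.

```python
# Minimal external health check: poll the primary endpoint and trigger a
# failover hook after N consecutive failures. The endpoint and the hook
# body are hypothetical placeholders.
import time
import urllib.request

PRIMARY = "https://app.example.com/health"   # hypothetical endpoint
FAILURE_THRESHOLD = 3
POLL_SECONDS = 30

def healthy(url: str) -> bool:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

def fail_over() -> None:
    # Placeholder: flip DNS or a load balancer pool to the standby site.
    print("Primary unhealthy -- failing over to standby")

failures = 0
while True:
    failures = 0 if healthy(PRIMARY) else failures + 1
    if failures >= FAILURE_THRESHOLD:
        fail_over()
        break
    time.sleep(POLL_SECONDS)
```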


Agile Pre-mortem Retrospectives

June 6, 2014

Failure is Your Friend is the title of the June 4, 2014 Freakonomics podcast.  The podcast interviews cognitive psychologist Gary Klein.  Klein talks about an interesting technique called the pre-mortem: "With a pre-mortem you try to think about everything that might go wrong before it goes wrong."  As I was listening to Klein talk about how this might work in the physical world and in medical procedures, it occurred to me that this might be a nice complement to an agile software development project.

Most scrum teams do some type of post-mortem after each sprint.  Most of the literature today calls these activities retrospectives, which has a more positive connotation.  (Taken literally, post mortem means "after death" in Latin.) After training exercises the Army conducts after action reviews, affectionately called "AARs."  For informal AARs (formal AARs have a prescribed format that is expected to be followed) I always found three questions elicited the most participation – what went well, what did not go well, and what could have been done better.  This same format is often effective in sprint retrospectives.

A pre-mortem retrospective would follow a very different format.  It asks the participants to fast-forward in time to after the release and assume that the project was a failure.  Klein's suggestion is to take two minutes and ask each participant to privately compile a list of reasons why the project failed.  He then surveys the group and compiles a consolidated master list.  Finally, he asks everyone in the room to think of one thing they could do to help the project.  Ideally the team ends up more attuned to what could go wrong and more willing to engage in risk management.

In concept the idea makes a ton of sense.  I can see how it would force the team to be honest with themselves about risks, temper overconfidence, and ultimately be more proactive.  On the other hand, a pre-mortem is one more meeting and one more activity that does not directly contribute to the project.  I question whether there is enough value to do a pre-mortem on every sprint; however, for major new initiatives it could be a useful activity.  I quickly found two references on this topic.

http://www.slideshare.net/mgaewsj/pre-mortem-retrospectives

http://inevitablyagile.wordpress.com/2011/03/02/pre-mortem-exercise/


Using the right database tool

April 27, 2014

Robert Haas, a major contributor and committer on the PostgreSQL project, recently wrote a provocative post entitled "Why the Clock is Ticking for MongoDB."  He was actually responding to a post by the CEO of MongoDB, "Why the clock's ticking for relational databases."  I am no database expert; however, it occurs to me that relational databases are not going anywhere AND NoSQL databases absolutely have a place in the modern world.  (I do not believe Haas was implying otherwise.)  It is a matter of using the right tool to solve the business problem.

As Haas indicates, RDBMS solutions are great for many problems, such as query and analysis, where the ACID properties (Atomicity, Consistency, Isolation, and Durability) are important considerations.  When the size of the data, the need for global scale, and the transaction volume grow (think Twitter, Gmail, Flickr), NoSQL (read: not-only-SQL) solutions make a ton of sense.

Kristof Kovacs has the most complete comparison of the various NoSQL solutions.  Mongo seems to be the most popular document database, Cassandra for row/column data, and Couchbase for caching.  Quoting Kovacs: "That being said, relational databases will always be the best for the stuff that has relations."  To that end there is no shortage of RDBMS solutions from the world's largest software vendors (Oracle – 12c, Microsoft – SQL Server, IBM – DB2) as well as many open source solutions such as SQLite, MySQL, and PostgreSQL.

In the spirit of being complete, Hadoop is not a database per se – though HBase is a database implemented on top of Hadoop.  Hadoop is a technology meant for crunching large amounts of data in a distributed manner, typically using batch jobs and the map-reduce design pattern; a sketch of the pattern follows below. It can be used with many NoSQL databases such as Cassandra.
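To illustrate the pattern itself, here is the canonical word-count job written in the style of a Hadoop Streaming mapper and reducer.  The snippet simulates the shuffle locally so it can run standalone; in a real cluster the framework handles the sort between the two phases.

```python
# Canonical map-reduce word count, Hadoop Streaming style: the mapper emits
# (word, 1) pairs and the reducer sums the counts for each word. Streaming
# sorts mapper output by key, so each word's pairs reach the reducer together.
import sys
from itertools import groupby

def mapper(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Local simulation of the shuffle: map, sort by key, then reduce.
    pairs = sorted(mapper(sys.stdin))
    for word, total in reducer(pairs):
        print(f"{word}\t{total}")
```

Running `echo "to be or not to be" | python wordcount.py` prints each word with its count, which is the whole pattern in miniature.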

