Into the Future

January 3, 2018

2017 is history and 2018 is officially underway, and that means it’s time for our annual look into the future and what the new year will bring for software development.  As most of you know, Evans Data is the top resource for market intelligence on software developers.  This month marks our 20th anniversary of focusing exclusively on software developers and continually surveying them to find out what technologies they’re adopting, what their issues and goals are and what they think is going to happen in the future of software.  We also have tremendous insight from our clients.  Each day we talk with them and we find out what they are interested in knowing and what technologies they are exploring and planning to develop.  So, we’re in a pretty good position to hoist a crystal ball and foretell what the big technology news will be in 2018.

Bitcoin and Blockchain

Well, you don’t need much inside information to know that Bitcoin has taken the investing world by storm.  During 2017 the price of the digital currency surged more than 1,900 percent and at one point almost touched $20K.  But as stock traders will point out, there’s definitely a bubble when even people who have no idea what an asset is are talking about it and trying to buy in.  Bitcoin is now what traders call “highly frothy” and we, like most financial observers, think that 2018 will see that bubble pop.  Blockchain, on the other hand, is here to stay and will continue to quietly grow as we go into the year.  The basic architecture of blockchain creates a distributed ledger, which makes for secure transactions that can always be verified and traced (though in some cases with difficulty).  That makes it very useful in the financial world, but we think it will also see wide acceptance in other areas such as healthcare, energy, defense and more.
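The core idea behind that verifiability can be sketched in a few lines of plain Python.  This is an illustrative toy, not a real cryptocurrency implementation: each block stores a hash of its own contents plus the previous block’s hash, so tampering anywhere breaks the chain.

```python
import hashlib
import json

def block_hash(body):
    # Deterministically hash a block's contents
    payload = json.dumps(body, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    # Each new block points back at the previous block's hash
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev}
    block["hash"] = block_hash({k: block[k] for k in ("index", "data", "prev_hash")})
    chain.append(block)
    return chain

def verify(chain):
    # Every block must hash correctly and link to its predecessor
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("index", "data", "prev_hash")}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"from": "alice", "to": "bob", "amount": 5})
add_block(chain, {"from": "bob", "to": "carol", "amount": 2})
assert verify(chain)

chain[0]["data"]["amount"] = 500   # tamper with history
assert not verify(chain)           # the ledger no longer verifies
```

A real blockchain adds consensus and replication across many nodes, but the hash-linking above is what makes the ledger tamper-evident.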

Artificial intelligence and Machine Learning

While BitCoin took the spotlight, the real surge in tech adoption came from the adoption of Machine Learning and Artificial Intelligence.  This was spurred by an intense interest as well as a flood of new tools and libraries such as Tensorflow, Caffe, Watson and others that help developers easily incorporate machine learning into their apps.  By the end of 2017, more than 6.5M developers worldwide (29% of the world’s developer population) were integrating some machine learning techniques into their apps, while another 5.8M were planning to. While the largest group is within the APAC region, North American adoption is also hot.  Expect this to grow.

Augmented Reality

It’s easy enough to imagine augmented reality when it comes to games, as Pokemon Go so amply demonstrated, but gaming is just a tiny fraction of what the technology can be used for.  We’re predicting many more implementations in healthcare, where doctors can now “look inside” a patient, or where a remote specialist can essentially project their hands into the display of an on-site surgeon wearing Google Glass.  In manufacturing, service technicians can walk up to any item that has IoT technology and immediately have access to the object’s specs, inventory, location, and lead times.  In education, AR can replace outmoded and awkward learning tools.  The power and versatility of AR mean its use is limited only by our imaginations, and we foresee many more AR use cases in the coming year.

Quantum Computing

This is the most nascent of the hot technologies for 2018, but one of the most far-reaching.  Put simply, quantum computing veers away from our established bit-based, on-off electrical circuit computing by allowing superposition, in which a state may be on, off, or a combination of both.  Based on principles from quantum mechanics, the technology is being pioneered by the likes of Google, IBM and Intel, and while it may not command a lot of attention in 2018, the foundation created this year will be of utmost importance in the years to come.
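What superposition means can be shown with a toy single-qubit simulation in plain Python (purely illustrative; real quantum hardware and SDKs work very differently): a qubit is a pair of amplitudes, and a Hadamard gate turns a definite "off" state into an equal mix of off and on.

```python
import math

# A qubit's state is a pair of amplitudes (alpha, beta);
# measuring yields 0 with probability |alpha|^2 and 1 with |beta|^2.
zero = (1.0, 0.0)   # the classical "off" state
one = (0.0, 1.0)    # the classical "on" state

def hadamard(state):
    # The Hadamard gate sends a basis state into an equal superposition
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

superposed = hadamard(zero)
p0, p1 = probabilities(superposed)
# The qubit is now genuinely "both": a 50/50 mix of off and on
print(round(p0, 3), round(p1, 3))  # prints 0.5 0.5
```

The power of real quantum machines comes from entangling many such qubits, so the state space grows exponentially, something this two-number sketch cannot capture.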

AI – Threat or Not? Software Developers’ Take

July 28, 2017

Well of course AI is a hot topic in software development and in the software community in general.  Developers everywhere are finding ways to incorporate one or more of the varied forms of AI, including machine learning, deep learning, pattern recognition, conversational systems, real-time data analysis and more.  AI adoption is everywhere.  But you really know it’s a hot topic when Silicon Valley titans like Elon Musk and Mark Zuckerberg get into a public tiff about how AI is going to impact us.

Telling the future is an imprecise art and developers are not prescient; however, as practitioners, they may have some insight. In June we published the next edition of our ongoing Evans Data AI and Big Data Survey – a survey of developers actively working in these areas – and asked them not only what AI projects they’re working on now but how their projects are going to impact people and their jobs.

The majority of their current projects (51%) are seeking to replace IT activities. This was also the category most developers (47%) were seeking to supplement.  That’s right, traditional IT tasks are most directly in the bull’s eye, and that’s likely to be the first community to see the impact. Is that for better or worse?

It’s clear that artificial intelligence is going to displace workers. There’s been lots of talk about robots replacing unskilled workers – burger flippers and such – and this is already happening, but AI and robotics aren’t the same thing, and the jobs that AI will replace or supplement are jobs that require a lot of cognition. It’s logical that it would start within the computing arena itself, and there seems to be a lot of excitement about it in that area.

But it spreads out from there. Customer support was the second most targeted category, selected by 41% of developers, while operational decision making was the second most selected job category for supplementation. These are areas that most of us would agree could use some help.  The industries that are being addressed most frequently are, once again, the computer or software engineering industry, followed by telecommunications and manufacturing.

Obviously AI will change a lot of things, starting with the software industry itself, but that doesn’t call for doom and gloom.  When asked what the most exciting things were that AI could bring, developers cited “Integration of machine learning with IoT” – the marriage of two new technologies that both hold significant promise – as well as “Real-time transactional databases”, which is hardly threatening, and “Predictive analytics”.  Of course it’s possible these could “turn bad”, but it’s most likely that implementations like these will instead usher in new efficiencies and open up areas and opportunities that haven’t even been thought of before.

Throughout history people have been afraid of the future and of the technologies that shape that future, but so far those innovations have been a boon to mankind.  Developers are smart and they are passionately embracing AI, so if we had to make a call on whether AI developers would side with Elon or Mark, our bet is that Mark would come out ahead.

The Changing Face of Software Developers

April 20, 2017

We all change as time goes by, but when we’re talking demographics, those changes aren’t always what we might expect.  Take software developers, for example.  Developers are, for the most part, getting younger.

The median age of today’s software developers has decreased overall on a worldwide basis, according to our recently released Developer Marketing 2017 survey report.  The report shows the median age of developers by region and trends in the four major regions going back ten years.

The overall age decline is largely due to an increase in younger Latin American developers, where the median age is now 35, as well as to the APAC region, where the median age is 34.  Developers in the EMEA region, typically the oldest, have also gotten younger, with a median age of 40 – down from 42 last year.  The median age of developers in North America remained steady at 39.

However, the real change has been the rise in the number of women developers.  That number has quadrupled in the last ten years, with sharp increases over the last few years.  Today, more than a quarter of all developers worldwide are women.  The largest growth in the number of women developers is occurring in the APAC region, and in emerging regions such as APAC and Latin America that growth is being driven by young women entering the profession.  In APAC, almost 40% of developers under 30 years of age are female.  North America has the smallest percentage of female developers of all regions, at just under 18%, but even that represents a lot of growth compared to ten years ago.

Many major technology vendors have gone out of their way to get women involved in STEM activities, and organizations devoted to getting women into technology, like “Girls Who Code”, have been making significant progress toward evening out the gender balance in technology.  Looks like it’s working!

Top Developments for 2017

January 5, 2017

It’s that time of year again when we take out our crystal balls and predict the trends that will power and dominate software development in the coming year. In fact, most predictions that are worth their salt are actually projections based on trends that we’ve noticed in the past. So last year and the coming year are inextricably connected, and so are our 2017 predictions.

First and foremost, artificial intelligence in all its forms will continue to be the hottest topic in software development and will go from curiosity to ubiquity as more and more developers incorporate AI libraries and functions into their apps.  At the end of 2016, 42% of developers said they were using some form of cognitive computing or artificial intelligence in their development projects, up dramatically from the number of practitioners at the beginning of the year.  Within the overall arc of AI, we see particular movement in the adoption of conversational systems (chatbots) plus machine learning in all types of apps, but especially those involved with the Internet of Things.  Deep learning is also an area of heavy interest that will grow over the next year, though at a slower pace than other AI-related disciplines.

Next is the further penetration of the connected world into everything we do and everything we use.  Internet of Things development became an even stronger force in 2016, with over a third of developers either currently working on IoT projects or having worked on one in the past as of the end of the year.  The strongest IoT segment being targeted flipped from the first part of the year and is now business-to-consumer, although industrial and commercial implementations are still strong.  The types of projects being worked on are diverse, and no one type dominates the IoT area of focus, reflecting the relative newness of the market.  We see a stronger presence in industry as well as commerce going forward.

Blockchain development is another area that we believe will blossom during 2017.  Already used in some form by almost one in five developers, this form of distributed database will become prevalent as developers marry it with IoT to address security as well as ease of use.  Over 60% of developers who use blockchain use it for a purpose other than cryptocurrency, and we expect the use cases that proliferate during 2017 to be diverse and sometimes surprising.

New user interfaces round out our 2017 predictions.  Virtual Reality will become more and more popular for gaming, but it’s Augmented Reality where we’re going to see explosive growth as the possible implementations traverse industries.  Today 29% of developers are incorporating some type of AR into their applications, and an additional quarter plan to.

So 2017 looks to be full of exciting and significant developments. Hoping it’s a happy one for all of you.

The Future of Computing

November 10, 2016

The Churchill Club is the premier Silicon Valley organization designed to promote the exchange of ideas and networking amongst those in the industry and to “ignite” conversations. It describes itself as “one part business, one part technology and 7000 parts people”. Last night it hosted Ginni Rometty, CEO of IBM, at its annual dinner and she provided information and insights that truly ignited excitement and a whole realm of thoughts for the future of computing.

It’s “Digital Today, but Cognitive Tomorrow” she said as she described IBM’s vision of augmenting human intelligence with computer cognition.  Having just been at the Innovation Hangar at World of Watson the previous week, she mentioned several of the innovative solutions the developers there were working on with machine learning and Watson.  IBM provides tools and libraries to help developers implement machine learning and other forms of artificial intelligence in their applications.  And they’re not alone.  Microsoft’s Machine Learning Studio, HPE’s Haven on Demand, and Google’s DeepMind are just a few of the tools that major vendors provide for developers to add intelligence to their development.  Our most recent Global Development Survey showed that almost 40% of developers worldwide are already working with some form of cognitive computing, with the APAC region leading the way.  Cognitive computing in all its varied forms is without doubt the hottest technology trend amongst developers today.

Blockchain development was another important area that Ginni talked about. She predicted that blockchain “…will do for trusted transactions what the internet did for the transfer of information”. IBM has upped the security aspect of blockchain development by providing special security features in its blockchain product for developers on Bluemix with confidential transactions, monitors and more. Development of the distributed blockchain architecture is a current activity for just over 18% of developers worldwide, with another 29% planning for it in the future. And it’s not just about BitCoin. Over two-thirds of those using or planning to use blockchain will use it outside of cryptocurrency. The most popular implementations are for an information hub or the management of IoT devices.

The last subject was cyber security, and with that she brought it all together.  Security is essential and becoming more so, but it’s also becoming harder to implement.  “As soon as you get somewhere in development you can assume the bad guys are already there.”  So that’s where cognitive computing again comes in.  With Watson for Cyber Security, IBM is seeking to answer the old question of how to protect technological progress while ensuring that it remains secure going into the future.

Ginni was impressive with her quick thoughts and insights and IBM was impressive as well as it blazes a trail into the most important new worlds in the future of computing.

Cloud Formations

October 20, 2016

The number of developers actively developing in a Cloud environment reached 5.4 million in 2016, according to our most recent Global Development Population and Demographics Study, the de facto standard in developer population estimates. This represents an increase of 375% since 2009.

The study, conducted twice a year since 2006, uses a plethora of publicly available data points and a sophisticated computer model to produce developer population estimates per country and region for almost 40 countries which then have primary research data overlaid to show numbers of developers adopting various technologies around the world.  In addition, ten years of continuous history produce highly reliable multiple regressions to use to project into the future.

So we have seen very reliable estimates of the increase in numbers of developers using Cloud as a development platform.  But in addition, new estimates for developers involved in Big Data and Advanced Analytics show the worldwide number is up to 6 million, while the biggest percentage gains, with a 34% increase in just one year, belong to the 2 million developers who are now targeting IoT systems.

If we consider these huge growth areas, it becomes clear that there is a convergence here and an inter-reliance that knits it all together.  IoT is enabled by the Cloud.  Without a Cloud infrastructure, the data that sensors produce and the information that is collected and acted upon would be constrained to a local environment and restricted by limited scalability.

Big Data is the form that data in an interconnected world often takes.  Videos, aerial photos, and pressure sensor readings, in addition to numerical and text formats, all have to be ingested and processed, and advanced analytics, often with the help of machine learning, must be employed to analyze and ultimately make sense of the whole picture.

Couple all this with the rise of platforms from virtually every major company in the world and we can begin to see a new world in which most everything works together.

Clouds enable, and even help create, the future.  Their growth shows our progress, and their shapes show our direction.

Machine Learning and Developer Programs

September 29, 2016

It looks like machine learning has taken the developer world by storm.  Among developers working on projects that involve Big Data today, over a third (35%) are using some form of machine learning in their applications.  And while the finance and insurance industry shares the top spot on the list of industries targeted for machine learning with IoT, each accounts for only 13.4% of implementations.  This shows a highly fragmented market, in which no single industry dominates the types of applications being imagined and created by developers for machine learning solutions.


In response to the huge interest in this form of artificial intelligence, major manufacturers are racing with each other to provide tools and APIs that facilitate ML on their platforms.  IBM has long offered Watson APIs on its Bluemix platform, while Microsoft has an entire Cortana development suite on Azure.  Amazon provides ML APIs for AWS.  HPE has Haven on Demand.  The list goes on and on.


But other than supplying the tools and APIs for developers to use, how else can ML benefit developer programs?


It turns out that there is actually a very rich set of capabilities that can be added to a developer program by implementing machine learning.  Just a few thoughts come to mind.  How about using ML to sort developer inquiries in an intelligent way to spot common themes or problems with products or tools?  Or, when a developer accesses an API, using ML to suggest other appropriate APIs or tools?  ML can be used to track a developer’s interests and movements within the developer program portal and then anticipate their needs, offering suggestions for additional documentation, training or tools based on their past behavior.  And, of course, chatbots can be used to supplement tech support and training – maybe even one day replacing the need for humans in those functions.
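As a hedged illustration of the first idea – grouping inquiries to spot common themes – a few lines of plain Python can cluster tickets by word overlap.  A real program would use a proper ML library and better text features; the sample tickets and similarity threshold here are invented for the sketch.

```python
from collections import Counter
import math

def cosine(a, b):
    # Cosine similarity between two bag-of-words count vectors
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def group_inquiries(inquiries, threshold=0.3):
    # Greedy single-pass clustering: attach each inquiry to the first
    # existing group whose representative is similar enough, else
    # start a new group. Returns lists of inquiry indexes.
    vectors = [Counter(text.lower().split()) for text in inquiries]
    groups = []
    for i, vec in enumerate(vectors):
        for group in groups:
            if cosine(vectors[group[0]], vec) >= threshold:
                group.append(i)
                break
        else:
            groups.append([i])
    return groups

tickets = [
    "auth token expires too fast",
    "api auth token refresh fails",
    "billing page shows wrong total",
]
print(group_inquiries(tickets))  # prints [[0, 1], [2]]
```

The two auth-related tickets land in one group and the billing ticket in another, which is exactly the kind of theme-spotting a program manager could use to triage a support queue.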


When we measured the complementary technologies being used, real-time event processing was cited as a factor in 30.8% of ML applications.  Image recognition and description (the ability to spot faces or specific things) was a factor in 28.9% of organizations’ use cases, and pattern recognition (the ability to see the same thing again) accounted for 28.3%.  Video processing was cited in more than one use case in four (25.6%), suggesting strong applications for surveillance and physical security.


Those are use cases for the general population, but with just a touch of imagination, these can all be folded into a developer program to provide new and exciting offerings to support developers and enhance their experience with your program.