The topic of Drupal’s backward compatibility has come up in various ways over the years and has usually been a subject of debate. When we responded to Dries’s “State of Drupal 2011” survey, only about 8.2% of the community members who responded indicated that improving backward compatibility was one of Drupal’s “biggest challenges”. But that is actually a sizable number of respondents, considering that the question offered 19 other options and we could only select three. It is also worth considering that the majority of those who took the survey may not represent the Drupal stakeholders who would benefit most from improved backward compatibility: people who might not work with Drupal full-time, might not personally maintain any code, and want to upgrade their sites to the next major version, but cannot do so easily because of missing modules. However close a module may be to complying with the next major-version API, a module release targets only one major Drupal version.
If this lack of API backward compatibility were common with Windows or Mac applications, or Firefox plugins, or just about any other software written for a platform, far fewer people would make these upgrades, or they would wait a very long time to do so (and might end up changing operating systems or browsers to avoid the headache). Mac apps are actually a counter-example: Apple has historically supported legacy code and legacy processor types with compatibility bridges which, although not particularly great for performance, at least allowed needed software to keep running. Firefox and Safari, my two most-used browsers, regularly release major updates which don’t break every plugin (at least with Firefox, some add-ons are identified as incompatible and disabled until a compatible version is released). And PHP deprecates functions, but usually doesn’t make it impossible to run code that worked in the previous major PHP release. But Drupal introduces major changes that force developers to learn a whole new API just to get their modules or themes running again. This made sense in the past for a variety of reasons, at least back when Drupal was still in diapers, but there may be sufficient reason now to revisit the issue and explore whether this approach still makes sense for the foreseeable future.
Lest it sound like I think this approach was always a bad idea, I should say that I do understand the logic behind breaking compatibility, at least historically. Pay too much attention to backward compatibility and innovation is limited, more bugs creep in, and the code starts to bloat. There are even former Drupal developers who claim they now work on slimmer platforms to avoid what they already consider too much bloat in Drupal. It’s doubtful that Drupal would be as popular as it is today if the core team had focused on maintaining backward compatibility in the API just to keep from breaking legacy code. But what worked in the early days of Drupal may not work so well now that there are thousands of modules which are not considered high enough priority to port to the next major Drupal release, and most larger Drupal sites seem to depend on at least one of them.
Bar chart from Dries’s summary of the State of Drupal 2011 survey results:
For the purposes of this post, I’ve marked up the image to discuss a few points. Yes, there are “only” 269 respondents who chose backward compatibility as one of Drupal’s biggest challenges. But it is also arguable that this challenge is directly related to many of the other challenges which ranked higher.
Consider the following survey options for “biggest challenges” as they relate to the issue of backward compatibility:
- Determining what modules to use (24.3%): When the modules you used on Project X aren’t available for months, or even a year, after the release of Drupal N, you might be back to square one: looking for modules that can do the same job and trying to determine a migration path for your data.
- Release cycle length (6.3%): When developers have to practically re-write many modules every time a new Drupal major release comes along, this puts a damper on the community’s enthusiasm to see new major versions of core. Shorter release cycles would be welcomed by the community if modules were not tied to a major Drupal version.
- Usability and ease of use (28.1%): I think it goes without saying that needing to upgrade every module along with the Drupal core is a barrier to “ease of use”, at least for site admins, especially if not every module is available for Drupal N.
- Rate of Drupal 7 adoption (16.1%): Clearly if Drupal 6 modules could still run in Drupal 7, without major work involved on the part of individual site maintainers, there would be more sites being migrated to Drupal 7. But instead, many developers who might have written a module for Drupal 6, due to the requirements of a particular project, and who were kind enough to contribute their work to the community, have reduced incentive to port it to Drupal 7 if other modules they need for their own sites aren’t ready yet. So there is a knock-on effect as each developer waits for another’s work to reach a particular state of completion.
- Losing the low-end/grassroots market (7.2%): This is another side-effect of the complexity involved in maintaining Drupal-based sites.
- Increasing complexity of core (19.1%): This is definitely a factor affecting porting modules to Drupal 7. Developers think that they really need to completely redesign for “fields in core”, entities, and all the other great stuff that came with Drupal 7, but they are so busy working on Drupal 6 projects that they don’t have time to stop long enough to work on Drupal 7 releases of their projects. And since they want to “do it right”, they delay until they are sure they know how to write a module that takes advantage of the improvements.
Altogether, these options total 101%, on top of the 8.2% who actually marked the “backward compatibility” option. Clearly there is some overlap, but the majority of the Drupal community would likely see significant benefit from any improvement in API compatibility which smoothed the upgrade path between major releases of Drupal. Note: supporting API backward compatibility would, of course, be likely to have at least some significant negative impact on Performance and Scalability (22.5%), so any approach we take will need to be a balancing act.
Of course the great Git migration, as fantastic and long-overdue as it was, is probably another obstacle currently delaying many smaller modules, ones which might be maintained by part-timers who now have to learn a new version control system as well as a new API. (This is ameliorated by the influx of developers who might have been put off by CVS and are now happily contributing with Git, possibly writing better versions of the modules that haven’t been ported, but possibly not interested in developing a migration path from similar modules that have not been ported.)
It’s a very different landscape from what the community faced when Drupal 4.7 was just released:
[…] After more than a year of development we are ready to release Drupal 4.7.0 to the world. More than five years, 13 major releases, 30+ servicing firms employing 100+ Drupal professionals, 300+ third party modules, and over 55,000+ Drupal powered sites later, Drupal 4.7.0 is finally here and it rocks!—Dries Buytaert, Drupal 4.7.0 released
When Drupal 7 was released at the beginning of this year, there were hundreds of thousands of Drupal 6 sites and several thousand modules for Drupal 6. Most of the statistics Dries quoted in 2006 have increased by an order of magnitude or more, as have the complications of upgrading to more complex APIs. I think it is likely that whenever Drupal 8 is released, we will also need to re-think the position about not supporting more than two major versions as it is likely that there will still be hundreds of thousands of Drupal 6 sites, a lot more than there were Drupal 4.7 sites when Drupal 6 was released, or Drupal 5 sites when Drupal 7 was released. The number of new sites running on Drupal 7 is steadily increasing, but the number of Drupal 6 sites has stayed fairly flat.
Even back in 2006, when Dries stated his position about ignoring backward compatibility to avoid bloat and performance issues, he did acknowledge that the time might come when this approach would no longer be acceptable:
[…] It seems inevitable that sooner than later, we will have to be a lot more careful about breaking peoples’ code. And when that happens, I fear that this will be the end of Drupal as we have come to know it. […]
How alienating is it that modules are tied to versions of Drupal and block major version upgrades?
A lot of people, if they find they have to go through a painful migration process to get from one version of software to the next, might start considering alternatives to the software, or at least find that it dampens their enthusiasm for continued innovation. I also suspect that developers who spend some time writing a simple module, one which does a simple task and does it well, may not feel motivated to go back and re-think how to write that module if they already have other projects going. Necessity is the mother of invention, they say, but when the necessity is pushed on you by arbitrary decisions to change an API, it can be truly demotivating. Working on last year’s module is not what you want to be doing on Christmas Eve. Yes, it’s a compromise to hold onto imperfect code in core, but as long as an API can be maintained without security issues, it should be, so that contrib modules are not specific to just one major version of Drupal. This would certainly be a welcome change for most of the community at this point.
What can we do… and when?
Ideally, modules which work for one version of Drupal should work in the next, without change, even if they don’t take advantage of “new core features”, just as many of the OS X apps I used years ago are still functional after five years, even if they no longer have all the “bells and whistles” I might have come to expect in a “modern” app and there may be some glitches here and there. But I think it’s likely the historical trend away from backward compatibility will continue when Drupal 8 is released (D7 modules will not work in a D8 installation unless significant work is done to ensure that old paradigms still get results). I don’t know for sure, but it seems likely that many very fundamental changes (e.g. the use of Symfony™ 2 components in core) are coming in Drupal 8, and it could be difficult to maintain much backward compatibility in the API without performance-crushing “compatibility layer” modules doing too much work. On the other hand, if we expect applications like Drush to sort out the path changes between Drupal 7 and Drupal 8 (the new directory structure), i.e. to remain Drupal-version-agnostic, we should be able to incorporate a lot of that same logic in Drupal itself: if a path makes sense based on the old structure, rewrite it to match the pattern used in the new directory structure.
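To make that path idea concrete, here is a minimal sketch of what such a remapping could look like. The function name and directory list are hypothetical, not a real Drupal or Drush API; it simply assumes, for illustration, that the new layout moves core directories under a core/ prefix and leaves site-specific paths alone:

```php
<?php
// Hypothetical sketch only: translate a legacy core file path to its location
// in a reorganized directory layout. legacy_path_remap() and the $relocated
// list are illustrative assumptions, not part of any real Drupal API.
function legacy_path_remap($path) {
  // Core directories assumed to have moved under core/ in the new structure.
  $relocated = array('includes', 'misc', 'modules', 'scripts', 'themes');
  foreach ($relocated as $dir) {
    if (strpos($path, $dir . '/') === 0) {
      return 'core/' . $path;
    }
  }
  // Site-specific paths (e.g. sites/all/modules/...) stay where they were.
  return $path;
}
```

A real compatibility layer would have to handle far more than file paths, but the principle is the same: if a request makes sense under the old structure, translate it rather than fail.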
Personally, I do hope that steps can be taken to remedy this issue in the future. Not every module will be compatible from one version to the next, but it would be ideal if there were a way to keep as much backward compatibility (in the API) as possible from one major version to the next, so that upgrading a site doesn’t mean every module needs to have been maintained and released for the new version of Drupal. This functionality could be provided by a group of API “bridge” modules which could be disabled once all of a site’s modules are using non-deprecated APIs, but would otherwise allow running a previous version’s modules, even at a loss in performance. The cost of queries and server processes has dropped a lot since 2006, along with the price of hardware, and that drop is likely to continue. We get much more bang for the buck than we used to. And there are also promising technologies which could offer better performance to Drupal, such as HipHop compilation of PHP code and new database technologies which can already be used with Drupal. If we have a nice cart for our “baggage”, it might be considerably less problematic for the community than “waiting for the next flight”. I’m curious to see how this plays out and hope that “the end of Drupal as we know it” could be a good thing (maybe we’ve already seen that, since Drupal 4.7 was certainly a very different beast from D7 or what we expect to have in D8). I’d be curious to hear your thoughts on the matter. Have we reached a point where the performance concerns are outweighed by the bulk of contrib code that needs to be ready in order to migrate a typical site? What approaches might work best? Could it be done in Drupal 8, or do we need to wait for a future “Drupal N”?
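For the simplest case, a “bridge” module could amount to little more than re-exposing old function names. For example, Drupal 6’s taxonomy_get_term() became taxonomy_term_load() in Drupal 7; a sketch of the shim follows (with a stub standing in for the real core function, so the snippet is self-contained):

```php
<?php
// Stub standing in for the real Drupal 7 API so this sketch runs standalone;
// on an actual site, core would provide taxonomy_term_load().
function taxonomy_term_load($tid) {
  return (object) array('tid' => $tid, 'name' => 'Example term');
}

// Bridge sketch: re-expose the Drupal 6-era function name so legacy modules
// calling taxonomy_get_term() keep working, at a small delegation cost.
if (!function_exists('taxonomy_get_term')) {
  function taxonomy_get_term($tid) {
    return taxonomy_term_load($tid);
  }
}
```

Simple renames like this are the easy case; hooks whose signatures or invocation order changed would need real adapter code, which is exactly where the performance worry comes in.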