Acquia: Drupal 8 All-in: Acquia is ready for your D8 project now!

Planet Drupal - Thu, 2015/08/13 - 3:11pm
Language Undefined

Part 2 of 2 - In a recent conversation with Tom Erickson, Acquia's CEO, we got talking about Acquia's "Drupal 8 All-In": what it is, and what it means for the Drupal world and those we have yet to convince ... Acquia has drawn a line in the sand, saying: this thing is so great, we are confident we can deliver the kind of help and experience that we've been guaranteeing with Drupal over the last 7 or 8 years. This thing is ready enough. And all of the rough edges that we find along the way? This is the perfect opportunity to sand them off. This thing, Drupal 8, is ready for real business. In short: Acquia is already running customer sites on Drupal 8, supporting all of Drupal 8, helping get the last rough edges off it for general release, and would love to talk with you about your next project and whether Drupal 8 is a good fit!


Web Wash: Create Custom Visibility Rules in Panels Using Ctools Access Plugins

Planet Drupal - Thu, 2015/08/13 - 1:06pm

Panels comes with a great feature that lets you control the visibility of individual panel panes. Visibility rules are useful when you need to show or hide a pane based on some criteria. You can add a rule by clicking the cogwheel on the pane and then clicking "Add new rule" within the Visibility rules section.

The default options are fine for simple configurations, but sometimes you’ll need to write a bit of code to implement complex requirements. To handle this, Panels utilises Ctools access plugins. So if you need to build custom visibility rules, just write your own access plugin.

Today I’ll show you how to create a basic access plugin for those times when the default options won’t cut it.
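As a rough sketch of the moving parts (the file name, callback names, and content type here are illustrative, not from the article), a minimal Ctools access plugin for Drupal 7 looks something like this. The file lives in your module's plugins/access directory, which you expose to Ctools via hook_ctools_plugin_directory():

```php
<?php
// my_module/plugins/access/node_published.inc
// Ctools discovers this $plugin array when the plugin directory is registered.
$plugin = array(
  'title' => t('Node: published status'),
  'description' => t('Shows the pane only when the node is published.'),
  'callback' => 'my_module_node_published_access_check',
  'summary' => 'my_module_node_published_access_summary',
  // Ask Panels to hand us a node context to evaluate.
  'required context' => new ctools_context_required(t('Node'), 'node'),
);

/**
 * Access check callback: return TRUE to show the pane.
 */
function my_module_node_published_access_check($conf, $context) {
  if (empty($context->data)) {
    return FALSE;
  }
  return $context->data->status == NODE_PUBLISHED;
}

/**
 * Summary callback: shown in the Panels visibility rules UI.
 */
function my_module_node_published_access_summary($conf, $context) {
  return t('Node is published');
}
```

The 'callback' does the actual check, and the 'required context' line is what makes the node available to it; plug in your own logic there.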


Steve Purkiss: Dropping in on the Brighton Homebrew Website Club

Planet Drupal - Thu, 2015/08/13 - 12:19pm
Thursday, 13th August 2015

Last night I attended the first meetup of the Brighton outpost of the Homebrew Website Club. I had planned on staying in and blogging about my recent Hungarian adventures at Drupalaton, but I'm very glad I popped my trusty Respect Your Freedoms-certified Libreboot X200 laptop into my lovely new Drupalaton tote bag, made the short but always eventful trip across the Laine/Lanes border from purkisshq to 68 Middle Street, and made progress on upgrading my website to Drupal 8. Here's why:

  • Focused time meant I actually did what always seems to get put back in the queue, and owning my own data and code is important to me in the long term.
  • Time-boxing an hour meant I had no choice but to prioritise; for me that was fixing an issue I had with getting DrupalVM running.
  • Collaboration: apart from attending the business day at our DrupalCamp Brighton event I haven't been to 68 Middle Street much since I wrote a blog post about Atomised Web Design, in which I kinda rip into one of the space's founders and the organiser of the Brighton Homebrew Website Club, Jeremy Keith (@adactio). Four-or-so years on, and thanks in big part to the stern work of MortenDK, Drupal 8 is a whole load better, so I don't feel so bad for my not-so-humbleness of old. Plus, I couldn't miss an event on a topic I'm so passionate about: Freedom!
Show & Tell

Not realising the openness of the space, I may have been my usual loud self when chatting to a fellow fashionably late attendee as we walked around the corner of a bar to the main venue space, where lots of people were sitting round tables while one stood up showing what they'd been working on. There was a wide range of people and interests, with most discussions being about what software and services people were using to control their internet communications.

When it came to my turn I didn't want to even attempt to connect my laptop to the projector, as I didn't really have that much to show, so I just waved my latest blog post about What is Drupal? around on the laptop. I explained that ever since I attended Léonie Watson's DrupalCamp Bristol keynote, The Metamorphosis of Accessibility, I have been deeply affected by how far behind we still are in providing any sort of acceptable web experience for everyone, and as I started blogging again I wanted to make sure my creative efforts would be accessible and enjoyable. I wrote a description of the photo in my blog post, but noticed the correct tags weren't there for accessibility; WAI-ARIA, the ARIA live announcements API, and TabManager are built in to Drupal 8.
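For readers unfamiliar with those tags, here is a hedged illustration of the kind of markup involved (the file name and text are made up for the example):

```html
<!-- Descriptive alt text makes the photo meaningful to screen-reader users. -->
<img src="drupalaton-tote.jpg" alt="A Drupalaton tote bag hanging on a chair">

<!-- An ARIA live region lets assistive technology announce dynamic changes
     without a page reload; Drupal 8 wraps this pattern in its
     Drupal.announce() JavaScript API. -->
<div aria-live="polite">Your comment has been saved.</div>
```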

Other people were using a wide range of tools and languages to build their sites; it will be interesting to see how they achieve some of the things I'm going to be doing with development over the next few months to explain more about what I do, in terms of delivering Drupal projects by working with the community, as well as other parts of how I approach my life experience.


Once updates were all done it was time to get down to business. Fellow Drupaler and DrupalCamp Bristol organiser Oliver Davies had mentioned the other day that I just needed to change nfs to rsync in the settings.yml file, and that worked - I should've spotted that myself, but wasn't thinking ;) So by changing just one setting I had a whole virtual machine with a fresh, working install of Drupal 8 up and running on my 7-year-old but freedom-respecting laptop - so much for "Drupal 8 is complicated, enterprise-only"!
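For anyone hitting the same DrupalVM issue, the fix is a one-line edit to the synced-folder type in DrupalVM's configuration file (the paths below are illustrative; current DrupalVM releases call the file config.yml):

```yaml
# DrupalVM configuration — switch the synced folder from NFS to rsync.
vagrant_synced_folders:
  - local_path: .
    destination: /var/www/drupalvm
    type: rsync    # was: nfs
    create: true
```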

Next was reading up on a production-ready version of the virtual machine which I could host on DigitalOcean. Once I learn more about the sysadmin side of things and feel more confident, I'd prefer to host it from home, as there don't seem to be any suitable distributed options I know about or could use which fully respect my (or your) freedoms - surely there's a business to be made there?! I'm also looking at a more home-grown solution from the UK, VLAD, but for the moment I'm going with what I know works for me; once I'm a little more confident and can inspect a working system, I'll see where I can go from there.

I'd just started to get into the production-ready issue when the proverbial bell rang and we were out of time! I made my way up to the Brighton Farm weekly new-media freelancers meetup where, amongst many other interesting conversations, I discovered the Cubieboard, which according to the FSF is all good for freedom apart from the WiFi controller; as I don't plan on hosting wirelessly, this sounds like a good option to investigate for self-hosting.

I'm looking forward to the next Homebrew Website Club meeting in a couple of weeks. I certainly managed to achieve movement, and am looking forward to getting off the Drupal 7 island and into the wide open waters of Drupal 8!

tags: Homebrew Website Club, Drupal 8, Drupal Planet, Planet Drupal, brighton

OSTraining: Using the Bootstrap Theme with Drupal

Planet Drupal - Thu, 2015/08/13 - 2:48am

Bootstrap is winning the web.

Nearly 10% of all websites now use the Bootstrap framework.

That's reflected on, where Bootstrap is the third most popular theme. Bootstrap is a base theme that integrates Bootstrap 3 with Drupal.

Here's a guide to getting started with the Bootstrap theme.


OSTraining: Love it or Hate it, Bootstrap is Winning the Web

Planet Drupal - Thu, 2015/08/13 - 12:14am

Back in August 2013, we wrote a post called, "The Bootstrap Boom is Just Getting Started".

At that time, we estimated that Bootstrap powered between 1.5% and 3% of the web.

Two years later, I decided to check in on Bootstrap. How popular is the framework now? Did the Bootstrap boom continue, or has it gone bust?


Isovera Ideas & Insights: Building Backbone.js Applications from a Drupal Perspective - Level 2

Planet Drupal - Wed, 2015/08/12 - 10:07pm
In this level, we will find that Backbone Views can be used very much like Drupal Blocks to display components across a site.

Drupal Association News: What’s new on - July 2015

Planet Drupal - Wed, 2015/08/12 - 9:30pm

Look for links to our Strategic Roadmap highlighting how this work falls into our priorities set by the Drupal Association Board and Working Groups.

July was an action-packed month at the Drupal Association - we had our quarterly prioritization with the Working Groups, held our annual all-hands summer staff meeting, and mentored two tremendously dedicated interns throughout.

Our primary engineering focus was on DrupalCI and - though we also found time to make some iterative changes to in a few areas, namely: issue credit refinements, performance, groundwork for the new content model.

Strategic Planning Prioritization for Q3 2015

The Working Groups help to provide governance for and to set priorities for the work of Association staff. Each quarter we evaluate our priorities with the Working Groups and update our Roadmap.

On July 15th we updated our roadmap based on the Working Groups input. Our main priorities for Q3 are services that are required to support the Drupal 8 release, and functional improvements to

  1. The port of to Drupal 7, as well as a few issues that support Drupal 8 localization.
  2. Making sure DrupalCI meets the MVP spec set out by the core developers for providing test coverage for Drupal 8, and that it meets the functionality required to replace the old testbot system.
  3. Improving search.
  4. A new documentation section based on our content strategy work that will provide better organization and governance of documentation.

Additional priorities were identified for Association Staff to tackle as time permits.

Drupal Association Summer Staff Week

July was also the time for the annual all-hands staff meeting. For one week, we gathered all our local and remote staff in our Portland office to discuss:

  • The mission, vision, and values of the Association.
  • Our ever-evolving relationship with the Drupal project itself.
  • Setting engineering and design principles for the team.
  • Finding sustainable revenue that will fund our work.
Internships at the Association

For 8 weeks beginning in mid-June the Association staff hosted and mentored two interns who had just completed Epicodus’ inaugural Drupal curriculum.

Bojana Skarich(BabaYaga64) and Daniel Toader(CountPacMan) worked with us on bug fixes, features, and theme work for the Conference Organizing Distribution and several related modules that allow the Association to run our DrupalCon websites.

We’d also like to thank our Supporting Partner ThinkShout for funding Daniel and Bojana’s work with us. This is just one small example of how the supporting partner program fosters our mission by promoting Drupal as part of a software development training curriculum and giving these new members of our community a great head start.

Issue Credit Updates

We deployed two small updates to continue to refine and iterate on the Issue credit system that we implemented in the beginning of this year.

Firstly, to support an earlier change allowing explicit attribution as a volunteer, we’ve updated the attribution :hover state display. Previously, unattributed comments and volunteer-attributed comments would both simply display the username in the attribution, though the distinction was being made in the comments themselves. Now that distinction exists not just in the data but in the display on comments.

Since releasing the issue credit system in March, there have been over 9,500 issue credits awarded on over 5,200 issues. Over 2,400 unique users and 250 unique organizations have been awarded issue credits. Over 1,000 projects (modules, themes, distributions) have credits that have been awarded. The last 90 days of issue credits can be viewed on each user and organization profile.

Secondly, we deployed a small change that will automatically generate the first comment on a newly created issue.

This automatically generated initial comment serves two purposes: It allows the original author of an issue to be credited when the issue is resolved, even if they did not leave any subsequent comments on an issue. It provides a link to the original issue summary providing a better at-a-glance view of what the original reporter wrote, even if the summary has since been edited a large number of times by other participants in the issue.

There are still additional refinements to be made as we find time - in particular, providing a UI to edit the attribution that will be made for the automatically generated first comment.

Entityreference_prepopulate Module

The new content model for requires a number of new modules. To ensure that the site remains performant we are serializing these changes as much as we can. The first new module to be deployed was entityreference_prepopulate.

As we work to build out the Documentation section we’ll be installing additional modules, creating some new content types, and providing a number of new resources for maintaining documentation on the site.

Advanced Aggregator

Improving performance of is an ongoing concern, particularly as we look to adding new modules that, while powerful, may also be somewhat heavy on a site of our scale. Utilizing advanced CSS/JS aggregation is something we began to gradually implement towards the end of June, and in July we completed the majority of the changes laid out in this issue.

With these changes we’ve largely completed the work that will be done here for the foreseeable future, though there may be a few more performance gains to be found here and there. Thanks again to mikeytown2 for his assistance.

Drupal 8 Blockers

DrupalCI

July was a huge month for DrupalCI. There are two major milestones for the Association’s work on DrupalCI.

  1. DrupalCI must meet the testing requirements for Drupal 8 Core and Contrib specified by core developers.
  2. DrupalCI must also meet or exceed the existing functionality of the PIFT/PIFR testbots for testing Drupal 7 and Drupal 6 so that the old testbot system can be retired.

The first milestone was our primary goal in July - while the second will be our hard focus in August.

We made tremendous strides towards the first goal, starting with a reformat of the test result output to better display in Jenkins. This new format more logically organizes the test output by:

Test Group -> Test Class -> Test Method -> Output/Result


This should make understanding the results of testing easier in the long run, and is also a precursor to displaying test result information directly on - which we hope to complete in August.

We also made improvements to the test history pages - so that project maintainers can make better comparisons of any given test result to the status of a branch when an issue was created, for example, or against the most recent branch test. These test history pages also allow maintainers to see which user triggered the test, and are the portal to the test results.

July also saw the deployment of patch level testing with DrupalCI - which can be enabled on a per environment basis for projects on

Towards the end of July we also enabled testing for Contrib projects - allowing any project maintainer on to begin using DrupalCI. We are asking project maintainers to enable DrupalCI for their projects and provide us with their feedback in this issue. This will be critical for us to retire the old testing infrastructure.

We also focused on improving the performance and efficiency of the tests. Minimizing the time it takes to initiate a test and complete a full test run both improves efficiency for developers and maximizes the reach of the Association budget for automated testing.

The new DrupalCI architecture automatically scales up and down the number of bots dependent on need, which will hopefully present a cost savings once we disable the redundant old testing infrastructure.

In addition to the architectural work above - we also upgraded our base environments to php 5.5 to support the change in Drupal 8 minimum requirements.

Finally, we improved the documentation for project maintainers for enabling DrupalCI testing on their projects.

Endless thanks to jthorson for his help.

After our community testing of the Drupal 7 port in June, we identified a critical path of remaining issues that needed to be resolved to allow us to complete the upgrade. Many of the issues were related to user roles and permissions, due to the differences between Drupal 6 organic groups and the Drupal 7 version.

We put a hard focus on resolving as many of these issues in July as we could, so that we would be ready to perform the final upgrade in August (which was completed successfully on August 12th).

Much thanks to Gábor Hojtsy and Sébastien Corbin for helping us with this process.

Revenue-related projects (funding our work)

DrupalCons saw a few small deployments in July - primarily to support the launch of the Session Schedule, BOF Schedule, and Social Event calendar.

We also added the ability for event attendees to purchase or renew their Drupal Association memberships while purchasing their tickets for DrupalCon.

Finally we are in the planning phase for some additional work to support our payment processing needs in India, and to support having the registration process live for multiple events simultaneously.

Sustaining Support and Maintenance

New Git Infrastructure Deployment

As mentioned in our June update - we put the bow on a long-standing project to migrate our git infrastructure to new servers in July. Much of the work to provision the new servers was completed in June - but the cutover itself was scheduled in the early weeks of July.

The new git infrastructure is now both redundant and highly available, greatly increasing the stability of a critical part of our infrastructure.

Serving Files from a Separate Domain

In July we also acquired a new domain and wildcard cert for * This new domain will be used to serve static files across the Drupal ecosystem of websites, providing benefits for security and reducing the size of http requests by serving these resources from a domain without cookies.

Work to serve files from the new domain name is ongoing, and many thanks to mlhess for helping us implement this change.

Load balancer stability

After continuing to debug the decreasing stability of our load balancers, we decided to swap hardware and rebuild the load balancers on different hardware. The new hardware also uses an updated configuration and operating system, which has proven to be more stable. The second load balancer is in the process of being built out using the new configuration and hardware. The project should be completed by the end of August, bringing stability back to one of our key infrastructure components.

Many thanks to nnewton for helping us diagnose and make this change.

Updates to Updates

One of the core services that provides is In essence this is a feature of Drupal itself. Because is the home of updates information for the entire project, we analyze our updates traffic as part of our project usage statistics.

Unfortunately the project usage stats have been somewhat unreliable - so in July (and continuing into August) we’ve given this system some attention.

Changes we’ve made or are in the process of making include:

  • We moved the updates system to a CDN (and then migrated to a different CDN provider).
  • We updated our processing to work with centralized logs on our loghost.
  • We are improving the performance of the process from a 3-4 hour run per day’s worth of data to a 1 hour run per month of data.
  • We are simplifying the process by removing an intermediate MongoDB deduplication/key-value store.

Work to improve the performance and stability of the updates stats system will be ongoing.

As always, we’d like to say thanks to all volunteers who are working with us and to the Drupal Association Supporters, who made it possible for us to work on these projects.

Follow us on Twitter for regular updates: @drupal_org, @drupal_infra.

Personal blog tags: whats new on

Mediacurrent: Cleaner and semantic markup with Drupal’s views

Planet Drupal - Wed, 2015/08/12 - 7:53pm

A few months ago I wrote about why good markup matters. As a front-end developer, I interact with Drupal’s markup on a daily basis and I experience first-hand the benefits of good markup and the frustration of poor markup. A lot of the content on a Drupal website is produced by views. Whether they are out of the box or custom views, they are powerful and present us with great opportunities to manipulate the content and markup.


Zivtech: Goodbye Ruby, Hello Node.js: Speeding up Sass

Planet Drupal - Wed, 2015/08/12 - 6:30pm
Recently at Zivtech we started migrating our base theme for Drupal called Bearskin over to a gulp-based CSS compiling system. Up until now we had been using the original Sass, written in Ruby, but lately it's been feeling a bit slow.

The sluggishness of Ruby's Sass really started showing on very large projects, and became a nuisance when coupled with LiveReload. Let's face it: when you make a single CSS change, waiting even a couple of seconds can feel like an eternity.

The first step to speeding things up was to get rid of some mixins that we no longer needed. The low-hanging fruit was Compass. Compass is an amazing framework that we had come to rely on, but now that browsers have largely caught up with each other, there isn't much need for Compass's vendor-prefixing mixins. Instead, when we do need prefixes we can use Autoprefixer or write our own mixin. Easy enough!
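Writing your own prefixing mixin really is a small job; a minimal sketch (selector and property chosen purely for illustration) might look like:

```scss
// Hand-rolled replacement for a Compass vendor-prefixing mixin.
// Only worthwhile for the handful of properties that still need prefixes.
@mixin transform($value) {
  -webkit-transform: $value;
  -ms-transform: $value;
  transform: $value;
}

.card:hover {
  @include transform(translateY(-2px));
}
```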

After getting Compass and some other Ruby gems out of the way we saw some improvement, but not enough to make us happy! So the search continued...

We decided as a group to move over to Gulp and try to get rid of Ruby altogether.

Why Gulp? Gulp is all Node-based, so we were able to use gulp-sass for compiling our CSS. That gulp plugin is just a thin wrapper around libsass, a C/C++ implementation that is wicked fast, even for large projects.
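A minimal gulpfile of the gulp 3 era (paths and task names here are illustrative) wires gulp-sass into a compile-and-watch workflow:

```javascript
// gulpfile.js — compile Sass via gulp-sass (a thin wrapper around libsass).
var gulp = require('gulp');
var sass = require('gulp-sass');

gulp.task('sass', function () {
  return gulp.src('sass/**/*.scss')
    .pipe(sass({ outputStyle: 'compressed' }).on('error', sass.logError))
    .pipe(gulp.dest('css'));
});

// Recompile whenever a source file changes (pairs well with LiveReload).
gulp.task('watch', ['sass'], function () {
  gulp.watch('sass/**/*.scss', ['sass']);
});
```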

Some quick Google searches led me to benchmarks performed by The Opinionated Programmer that compare several CSS preprocessors. Long story short, libsass is about 25 times faster than Ruby Sass on a first run, and even after Ruby has a sass-cache available, libsass is still about 12 times faster. That's a massive improvement!

Advomatic: Decoupling Drupal Without Losing Your Head — Part 2

Planet Drupal - Wed, 2015/08/12 - 6:08pm
From Styleguide to Final Product

In the first article in our series we talked about how styleguide-driven development (SDD) and decoupling both serve many of the same goals, and SDD can do so in a way that avoids the many challenges of decoupling. Now we’ll get deeper into the details of SDD and how we... Read more »

Drupal Watchdog: VIDEO: DrupalCon Los Angeles Interview: Narayan Newton

Planet Drupal - Wed, 2015/08/12 - 4:38pm

Narayan Newton (Partner, Tag1 Consulting), takes a break from spring-cleaning his infrastructures to recall a once-upon-a-time Drupal takeover of a hotel kitchen, and the “semi-gourmet” meal that resulted.
He also divulges the nature of his continuous integration setup. (Jeeves. Or maybe Jason. Jenkins? I don’t know, something with a “J.”)
Finally, he admits that “I vote for President based on hair, so, yes, I’d definitely vote for Dries.”
BTW, this would be an excellent time for you to subscribe to Drupal Watchdog:

Tags: Video, DrupalCon LA, Drupal

LevelTen Interactive: How to use Views RSS to create a Mailchimp RSS email campaign

Planet Drupal - Wed, 2015/08/12 - 4:18pm

Mailchimp provides a great tool for automatically sending out emails that pull information from RSS feeds (read more from Mailchimp on this topic at Create an RSS-Driven Campaign with the Campaign Builder). However, when I first tried out the Mailchimp RSS campaign builder with Drupal's out-of-the-box RSS feed, there were some things that didn't work well.... Read more


Four Kitchens: Use Grunt and AdvAgg to inline critical CSS on a Drupal 7 theme

Planet Drupal - Wed, 2015/08/12 - 10:34am

Inlining critical CSS on a dynamic CMS such as Drupal or WordPress doesn’t have to be a pain. Using the Fourword as an example, we’ll go through the process of generating critical CSS, inlining the file dynamically, and asynchronously loading the remaining CSS aggregates using the AdvAgg module.


Foster Interactive: - Tales from the Trenches, Building the First Multilingual Conference Site in Drupal 8 Beta

Planet Drupal - Wed, 2015/08/12 - 10:00am
Learn about our experience building in Drupal 8 beta. We were very impressed with D8's new features, but also shed a few (many) tears over some very frustrating challenges in the build. Read our first impressions on Twig, multilingual support, beta-to-beta migrations, and configuration management. I've also added a few tips for diving into D8, and highlighted some challenges I think will scare off some early adopters from learning D8.

Modules Unraveled: 144 Using the Open Sourced Red Test Framework Instead of Simpletest for Faster Drupal 7 Testing with Neerav Mehta - Modules Unraveled Podcast

Planet Drupal - Wed, 2015/08/12 - 7:00am
Published: Wed, 08/12/15
Download this episode

Automated Testing
  • Before we actually get started talking about Red Test, can we take a step back and talk about what automated testing is in general?
    • I am sure you are all aware of manual testing and QA. Once you build a site or a feature in the site, you go to the browser and test it out. As an example, you added a workflow to your blog post. You will go to the browser and test whether the blog you create gets the draft state. Then you will log in as an editor and you will move it to published state. Now you will log out and check whether the blog post is visible to anonymous users. What you are doing is manual testing. It works very well for smaller projects.
    • For bigger projects, and especially for projects with multiple developers, adding a feature may break an older one. Let’s take the example above. Earlier, every blog post was published directly as soon as the writer saved the form, and an email went out to all your blog subscribers that a new post had been created. Now, after adding the ability to save a blog post in the draft state, you want the email to go out only after the post has been published. Suppose the developer changed and tested the workflow functionality without changing the email-sending rule and pushed the feature to the live site. You will soon be flooded with complaints from your blog subscribers.
    • This is a very simple example but such problems increase exponentially when the software grows in size over time. Such problems can be prevented by having automation testing in place. For each feature in your software, you write test code and you run all these tests often, and especially after working on a feature. You push your new feature to production only after all the tests pass. This way, as long as you have ample test coverage, you minimize your chances of finding a bug on the production site.
  • Who generally does this testing, the developer? site-builder? site-owner?
    • There are different schools of thought here. In practice, I have seen two different approaches:
      • If you are following test-driven development, then the developer writes these automated tests. In fact, he writes the tests before he even writes the code for the feature and makes sure that the tests fail. Then he implements the feature and makes sure that all the tests pass. In our company, our developers write the tests but that’s usually done after he has implemented the feature. We are experimenting with test driven development as of now.
      • The other approach I have seen is having a separate QA team for testing. They write automated tests as well as do manual QA. The advantage of this approach is that you can hire a QA person, who is usually much cheaper than a developer. The problem I have seen with this approach is miscommunication between the developer and the tester. Because of the delay due to communication, and because the tester writes the tests after the developer has completed the task rather than in parallel, it takes more time for a feature to get out of the door.
  • What do these automated tests test for?
    • Ideally everything, but practically that’s not possible. As with everything in life, this follows the Pareto rule: 20% of the tests will cover 80% of the functionality, and you should definitely write those 20%. In addition, we follow a rule that no bug should appear on production twice. So if a bug is found on production, we write an automated test for it so that it doesn’t appear there again. What these tests should cover is very dependent on your application. If you are building an informational blog site with a beautiful theme and you are making changes mostly to the theme, then you should write regression tests for your CSS and JS. On the other hand, if your site has a lot of custom business logic, access permissions, and rules, then focus on testing the logic.
  • I always assumed tests were for functionality, can you give me an example of something you would test on the theme layer?
  • I have to admit that I haven’t ever written an automated test for any of the sites I’ve built. I did have an experience a little over a year ago where the videos on my site (which were supposed to be behind a paywall) weren’t, because I made a change to Panels that messed up the access control. I didn’t realize it for two months! If I had had tests in place to check access to those videos, I would have been in better shape. So even though my site is a small site in the grand scheme of things, even I can benefit from writing appropriate tests.
  • Red Test, as I understand it, is an open source integration testing framework for Drupal that you developed at Red Crackle. Can you tell us briefly what it does and why we should use it?
    • Red Test is an open-source integration testing framework for testing your Drupal applications. If you have multiple developers working on a big Drupal application stretching into months, you know that you need automated testing to keep adding more and more functionality without breaking old code. Red Test is ideal for testing:
      • Site structure and configuration
      • Custom business logic of your application
      • User access, roles and permissions
      • Rules
  • Drupal 7 supports Simpletest by default. Simpletest has unit testing and functional testing. Why do we need another automated testing solution? How is Red Test different from Simpletest and why should people use it?
    • You correctly mentioned that Drupal 7 supports Simpletest by default. In real life, when you are working on a big project, there are quite a few challenges when you test Drupal sites.
      • Drupal assumes that a persistent database connection is available, so any hook or function is free to access the database at will. This means that unit testing is not possible for any function that accesses the database. You can obviously refactor your code, but that takes a long time. And since Drupal stores all its configuration in the database, most of your custom code will need to access the database anyway.
      • For every test, Simpletest creates a new test environment with only a minimal set of modules enabled. So if you want to test your current site with all its configuration, you first have to capture all that configuration in code and then run the tests. That is too much overhead.
      • Functional testing in Simpletest is very slow because it uses a headless browser and every request needs to bootstrap Drupal. It’s not uncommon for a test suite on a large site to take multiple hours to finish.
    • Red Test alleviates these problems. It is an integration testing framework so it tests your site with the current configuration. It actually depends on the database so there is no need to refactor your code to make it work with Red Test. In fact, Red Test code is totally separate from Drupal code. We have created Red Test so that it runs in parallel on multiple processors of your machine and bootstraps Drupal only once at the start so it is 60 times faster than Simpletest.
  • A lot of Drupal developers have started using Behat which helps in testing your site functionally through a browser. With Behat gaining traction, is there still a need for Red Test?
    • Behat is an excellent tool and, in fact, we use it in addition to Red Test. Since Red Test is an integration testing framework written in PHP that resides on the server, it can’t really test JavaScript. So wherever JS functionality needs to be tested, Behat is the right tool. For testing all the other business logic in the application, we use Red Test. If you have used Behat on a big project, you will know that creating and especially maintaining Behat tests takes a long time. Every time an HTML id changes, the Behat tests need to be updated.
    • Similarly, when you add a new required field to a form, the Behat test needs to be updated to fill that field in. Red Test, on the other hand, knows about Drupal and its entities, nodes, users, roles and permissions, so it does a lot of tasks automatically in the background. As an example, if you added a new required field to a node form, Red Test would automatically fill suitable values in that field without you having to change anything in your tests. This makes it very easy to create and develop tests with Red Test.
  • In fact, we have taken some measurements over the last couple of months and found that with the latest version of Red Test, creating and developing automated tests takes about 12% of total development time.
  • Is it easy to get started with writing automated tests using Red Test?
    • Yes, we are using all the standard PHP tools, so it’s pretty easy for a developer to get started. Red Test uses Composer for installation and is based on PHPUnit. In fact, we measured how much time it takes a newbie to create their first integration test using Red Test, and it comes to a little less than 15 minutes. The blog post linked in the episode notes walks you through getting started.
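Since Red Test is based on PHPUnit, a first test has roughly the following shape. This is only a hypothetical sketch: the helper class and method names are illustrative, not Red Test's actual API (see the getting-started post in the episode links for the real thing).

```php
<?php
// Hypothetical sketch of a PHPUnit-based integration test. The
// NodeHelper class and its methods are illustrative only, not Red
// Test's real API.
class ArticleAccessTest extends \PHPUnit_Framework_TestCase {

  public function testAnonymousUserCannotEditArticle() {
    // Red Test would fill any required fields automatically here.
    $article = NodeHelper::createArticle();

    // Business-logic assertion: anonymous users must not edit.
    $this->assertFalse($article->isEditableBy('anonymous'));
  }

}
```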
Use Cases
  • What are some concrete examples of what you should test?
    • Red Test is ideal for testing:
      • Site structure and configuration
      • Custom business logic of your application
      • User access, roles and permissions
      • Rules
    • If you are building just a basic informational site on Drupal without much functionality, you may not want to use Red Test. Use it only if you are building a large Drupal application.
  • What’s in the future for Red Test? Do you have improvement plans, or plans to integrate with D8?
    • Currently a developer needs to create tests using Red Test. We are considering whether it makes sense to enhance it to work with the Gherkin language. The next obvious step is to migrate it to Drupal 8.
Episode Links: Neerav on drupal.org | Red Crackle on Twitter | Blog post | Write a Red Test in under 15 minutes | Testing blocks using Red Test
Tags: Automated Testing, planet-drupal

Realityloop: Automating image optimisation with Drupal to improve performance

Planet Drupal - Wed, 2015/08/12 - 5:45am
12 Aug Brian Gilbert

Recently I blogged 6 tips for manually optimising images, now I’m going to show you how to do it automatically with Drupal!

Some time ago we undertook a performance audit for the website of a substantial local news service. One of the areas I found that we could increase the performance of their website was to compress their numerous images.

After identifying the commands to apply using the various optimising programs we found that the ImageAPI Optimize module got us some of the way there. We were then able to develop a patch to implement the extra binaries and later gain co-maintainer status, allowing us to improve the module further.

What is the ImageAPI Optimize module

The ImageAPI Optimize module allows you to preprocess and optimise uploaded JPG and PNG images that get generated via an Image Style rather than use Drupal’s inbuilt compression which only affects JPG files.

Despite the name, the D7 version of this module does not depend on ImageAPI. It depends only on the core image.module.

It allows the use of multiple binaries, some of which are lossy compressors, so note that you may want to only use some of the tools listed below.

      • Download and enable the required module
      • Configure appropriately
        Navigate to /admin/config/media/image-toolkit in your site and select the ImageAPI Optimize radio button, then click Save.

        Now you will need to set your ImageAPI Optimize Base Toolkit; if you have ImageMagick on your server you may want to select that instead of GD.

        Then you want to configure the ImageAPI Optimize Service settings.

        To find the path for each of the binaries, run the following commands in a terminal on your server, one at a time, and copy the output into the relevant fields. If a command produces no output, follow the instructions in step 3 to install the binary in question.

        which advpng
        which optipng
        which pngcrush
        which advdef
        which pngout
        which jpegoptim
        which jfifremove

        Then in the advanced settings for the jpeg tools I recommend you set the output to be a progressive jpeg. For more on why this is a good idea see the following links:

      • Set up image compressors on your webserver & development computer if they weren’t on your server

        I will only provide instructions for Debian based Linux variants, and OS X (where homebrew is installed) here.

        advpng (lossless) which is part of the advancecomp package
            Linux: apt-get install advancecomp
            OS X: brew install advancecomp

        OptiPNG (lossless)
            Linux: apt-get install optipng (ensure you are using a version 0.7.5 or higher)
            OS X: brew install optipng

        Pngcrush (lossless)
            Linux: apt-get install pngcrush
            OS X: brew install pngcrush

        pngout (lossless)
            Linux: download binary archive and extract to /usr/local/bin
            OS X: download binary archive and extract to /usr/local/bin

        pngquant (lossy)
            Linux: apt-get install pngquant
            OS X: brew install pngquant

        jpegoptim (lossless & lossy)
            Linux: apt-get install jpegoptim
            OS X: brew install jpegoptim

        jpegtran (lossless) which is part of jpeglib package
            Linux: apt-get install libjpeg-progs
            OS X: brew install libjpeg

        jfifremove (lossless)
        You need to compile it and then copy it into your path:

        wget <URL of jfifremove.c>
        gcc -o jfifremove jfifremove.c
        mv jfifremove /usr/local/bin
What is the real world effect?

This acts on any image that is displayed via an image preset. The original images are not affected. I can’t personally think of a case any time recently where we are displaying original uploaded images so for our usage this is perfectly acceptable.

Filesize saving in a really simple test of an iPhone6+ screenshot in PNG format showed a reduction in size of around a quarter.

What to do if you are using shared hosting

Odds are that you won’t be able to install the tools above using the method shown. You’ll likely need to create statically linked binaries so that you can upload them to the server.

A statically linked binary is an executable that doesn’t require any support libraries, as they are included in the binary itself. This results in a binary that will run on any system of the right CPU type (i386, x86_64). The downside is that the resulting binary ends up being larger.

Normally, when you download a tarball of source code, you’ll do the standard “configure; make; make install” to build it. If you want a statically linked binary, replace the plain “make” with:

    make SHARED=0 CC='gcc -static'

You will need to do this on a machine with the same architecture as your shared hosting.

Pro tip

It’s possibly worth using the ImageCache Actions module to automatically convert GIFs to PNG, because there isn’t really any good case for using GIFs now, especially with the above in place. The exception is if your site expects animated GIFs to be uploaded, but even those animations become static when displayed via an image style.

drupal planet

OSTraining: Use Rules to Automatically Update Drupal Content

Planet Drupal - Wed, 2015/08/12 - 2:32am

In my opinion, there are two modules that illustrate the power of Drupal better than any others.

The first module is Views, which most people become familiar with as soon as they learn Drupal. The other is Rules.

Rules is the basis for almost any workflow that's set up in Drupal. You probably need Rules, whether you're running an intranet, a social network, an e-commerce site, or any site that needs "First this, then that" commands.

In this tutorial, we're going to show you how to use Rules to automatically update a node. We're going to keep a record of the last person to view this node.


Drupal Watchdog: The Tao of the Wayfinder

Planet Drupal - Tue, 2015/08/11 - 9:29pm

Everyone working on software has a baseline competency with communication, yet it’s not unusual to see the time required to communicate effectively viewed as a distraction from the real task at hand. People seeking to deliver valuable software in a more timely and reliable fashion tend to turn to concrete changes – a new ticket system or software tool, more developers, performance bonuses – rather than delve into developing communication skills, despite the many concrete ways improved communication can increase a project’s momentum.

Set and Communicate Goals for Response Time

You’ve probably experienced some form of the information black hole. Maybe you put in a request to have an account created but have no idea if it will be minutes or months until the request is fulfilled. It’s important, but not urgent, and you don’t want to be “that person” who demands immediate attention. If only you knew how long it should take, you could set aside the stress of not knowing. Then, if a response didn’t arrive when it should have, you’d know it was time to double-check.

Both individuals and teams can:

  • Set honest response times;
  • Communicate them clearly and visibly;
  • Monitor how they’re living up to those goals;
  • Adjust processes or goals accordingly.

People are free to focus on other things when they know how long they have to wait.

Setting such expectations also frees you from more “Is it ready yet?” e-mails. Sending an auto-reply or a cut-and-paste acknowledgement like this should do the trick:

“If your request hasn’t been answered in three working days, something has gone amiss. Poke us with a stick by e-mailing”

It can be as formal or as playful as suits your needs.


Palantir: D8FTW: Storing data in Drupal 8

Planet Drupal - Tue, 2015/08/11 - 9:10pm

Newsflash: Storing and retrieving data is rather the point of a Content Management System like Drupal. Of course, not all content is created equal. Some needs a robust structure and curatorial controls built around it, while other data isn't really content at all but administrator-defined configuration. The way those need to work can vary widely.

In Drupal 7, developers had three not all that great ways of storing data: Entities (usually nodes), the Variables table, and "here's an SQL connection, enjoy!" That doesn't cut it for modern sites, unfortunately. What's more, everything was stored in a single SQL database which is part of what made configuration staging so difficult in Drupal 7; we had to build complex systems to extract the configuration out of arbitrary SQL tables, serialize it, and put it back in.

Not surprisingly, Drupal 8 has largely fixed that problem by tackling the different types of data that may need to be stored, each with their own dedicated APIs. Of course, moving from one big blob of data (aka, arbitrary SQL) to structured APIs requires changing the way you think about arbitrary data. So let's review the different ways to store stuff in Drupal 8, and where each of them is useful.


The simplest option is the Drupal State API. The State API is a simple key/value pool for values that are, by design, specific to a single Drupal install. Good examples here include the timestamp of the last cron run, generated optimization lookup tables (which should not get cleared as often as the cache does), the currently active theme, and so on. These are all values that are not user-provided configuration, and would make no sense to deploy from staging to production or vice versa.

State can store values of any type, as they will be serialized and unserialized automatically. However, not all object types can be serialized. In particular, any object that has a dependency on a service should never be serialized. Only serialize value objects.

Note that every read of a state value is a new hit against the underlying database. If you're loading multiple values out of state for some reason, use the getMultiple() method.

The state API is a single namespace, so be sure to namespace your state entry key names with your module name, like "mymodule.last_person_hugged".
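As a quick sketch of the API described above (the key names are illustrative), reading and writing state looks like this:

```php
<?php
// Write a module-specific value; the "mymodule." prefix keeps the
// shared state namespace tidy.
\Drupal::state()->set('mymodule.last_person_hugged', 'Alice');

// Read it back, with a default for when it has never been set.
$name = \Drupal::state()->get('mymodule.last_person_hugged', 'nobody');

// Fetch several values in one underlying lookup instead of many.
$values = \Drupal::state()->getMultiple([
  'mymodule.last_person_hugged',
  'system.cron_last',
]);
```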


The State API is itself just an abstraction layer on top of the Key/Value API. The Key/Value API allows the storing of any arbitrary serializable value, with the keys namespaced to a "collection". The "state" is simply one collection.

It's also possible to use your own collection, directly accessing the Key/Value API. However, if you're going to do that it's helpful to define your own class that composes the Key/Value factory service, much the same way that the State API does. At the moment there aren't many tools to quickly replicate that functionality, but the process is straightforward and the State class itself is readily copy-pasteable. Most of the non-trivial code in it is simply to cache loaded values so that the Key/Value store is not hit multiple times per request for the same value.
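A minimal sketch of direct Key/Value use, with a hypothetical collection name:

```php
<?php
// Obtain a key/value store for a custom collection. State is simply
// the "state" collection of this same API.
$store = \Drupal::keyValue('mymodule.lookup_tables');

// Store and retrieve any serializable value.
$store->set('expensive_table', $table);
$table = $store->get('expensive_table');
$store->delete('expensive_table');
```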


Content in Drupal 8 means the Entity API. Drupal Entities are much more rigidly structured than in an ORM like Doctrine. There are three layers to the Entity API, conceptually:

  1. Entity Types define different business logic for different objects; what that logic is varies by Entity Type. Generally a new Entity Type involves writing a new PHP class. Examples include Nodes, Users, Taxonomy Terms, Comments, and Custom Blocks.
  2. Entity Bundles are different configurations of the same Entity Type, with a different Field configuration. Creating one involves setting up configuration, which in (nearly) all cases involves an administrator pushing buttons. "page nodes", "article nodes", and "event nodes" are examples of different bundles of the "node" Entity Type.
  3. Fields are the smallest basic unit of Drupal content. A field is a single rich-value. Rather than "string" or "int" it is a value like "email address", "formatted text", or "telephone number". It can also be a reference to another entity. All entity objects can be viewed as a collection of Fields, each of which may be single- or multi-value. (As far as the code is concerned Fields are always multi-value, but may be configured to only bother storing one value.)

The key aspect of Content is that it is generally user-generated and of potentially infinite cardinality. (That is, there's no limit on how many entity records a user can create.) Unlike in previous Drupal versions, however, the Entity API is robust enough that it is reasonable to use for nearly all user-provided data rather than just a select few types.

If you want users to be able to enter data into the system, and there's no hard-coded low limit to how many entries they can make, Entities are your tool of choice. Building a custom entity is also much more viable than in past versions, so don't be afraid to define your own Entity Types. There's no need to just piggy-back on nodes anymore.

Content Entities are also translatable into different languages. The capability is baked into all Field types, making multi-lingual content a breeze.
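For example, creating a node through the Entity API takes only a few lines. This sketch assumes the default article content type exists:

```php
<?php
use Drupal\node\Entity\Node;

// Create and save an "article" node programmatically; the field
// names assume the default article content type is present.
$node = Node::create([
  'type' => 'article',
  'title' => 'Hello from the Entity API',
  'body' => ['value' => 'Created in code.', 'format' => 'basic_html'],
]);
$node->save();

// Load it back and read a value through the Field API.
$loaded = Node::load($node->id());
$title = $loaded->getTitle();
```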


The most important new data system in Drupal 8, though, is the Configuration system. The Configuration system replaces the variables table, the features module, half of the ctools module suite, and the myriad custom tables that various modules defined in previous versions with a single, coherent, robust way to store, manage, and deploy administrator-provided configuration.

That last part is key. The Configuration system is your go-to tool if:
1) Users on the production site should not be changing the values. (If they should be changing values on production, you probably meant for it to be Content.)
2) If you have a staging site, you will typically edit the values there and then deploy them to production en masse.
3) The values affect the business rules of the module or site.

For the Drupal 7 users, pretty much anything for which you ever thought "this should really be in a Feature module" now belongs in Configuration.

The configuration system is modeled as a namespaced key-value store (although it does not use the Key/Value system internally, as the implementations are quite different). The keys are dot-delimited strings, and the values are specifically "Configuration objects". Config objects have get() and set() methods to manage properties on the object, among other features we won't go into here.

Most importantly, config objects can be safely serialized to YAML and unserialized from YAML. That’s what differentiates the Configuration system from the other data storage systems: its canonical form is not SQL but YAML, which can be loaded into the system or exported from it. Modules can provide default configuration files in YAML, which get imported into the site when they’re installed. A site can also export some or all of its configuration files to a known directory on disk. That could be hundreds of files, but that’s fine. Once in that directory the files can be easily checked into Git, checked out on another server, and imported from files back into config objects in Drupal. Configuration deployment: solved!
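A short sketch of reading and writing configuration (mymodule.settings is a hypothetical config object name):

```php
<?php
// Read-only access to a configuration object (here, core's site
// settings).
$site_name = \Drupal::config('system.site')->get('name');

// Writable access. A module would normally ship the defaults for
// mymodule.settings as config/install/mymodule.settings.yml.
\Drupal::configFactory()
  ->getEditable('mymodule.settings')
  ->set('items_per_page', 10)
  ->save();
```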

You will also run across something called "Configuration Entities". This seemingly mixed concept is a way of providing CRUD behavior using the basic Entity API but backed by the Configuration API. Configuration Entities do not have Fields, but otherwise use essentially the same API. Configuration Entities are useful for cases where a user or administrator may be making multiple instances of a given configuration object. They're also the storage mechanism underlying Plugins.

Configuration objects are also translatable, which allows sites to make string value configuration available in the language of their users.


Tempstore is a bit of an odd duck in Drupal 8's data storage world. It's provided not by core but by the user module, and there are actually not one but two different tempstores: one private, one shared.

A tempstore is for data that needs to be persisted between requests without being saved back to the canonical storage (such as an entity or configuration object). If that sounds like PHP's native session handling, it should; the use case is very similar. The main difference is the shared tempstore is, as the name implies, shared between users, whereas sessions are, by design, not.

The quintessential (and original) example of that behavior is Views. A View is stored as a configuration entity. You don't want the View to be incrementally updated every time a single field is changed, though; you want to make a series of changes and then save the changes all at once. Instead, a temporary copy of the View config entity is saved to the shared tempstore every time a setting is changed. That allows changes to survive a browser restart, or a lunch break, without affecting the live copy of the View. It can even be picked up by another user if the first user gets delayed or goes on vacation and forgets to hit save. When the View is saved then the temporary copy is written back to the configuration system and the temporary version cleared.

The private tempstore works the same way, but its values are not shared between users. That makes it more appropriate for wizard-type interfaces or multi-step forms.

Both tempstores are backed by the Key/Value API internally. The Key/Value API offers an "expirable" variant, which tempstore uses, where values eventually get cleared out, say if a View is left mid-edit for several days. In practice, unless you're building complex multi-step UIs you won't run into tempstore very often.
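A minimal sketch of the private tempstore. The collection and key names are illustrative, and the service name shown is the one used in early Drupal 8:

```php
<?php
// Get a per-user temporary store for the "mymodule" collection.
$store = \Drupal::service('user.private_tempstore')->get('mymodule');

// Persist in-progress values between requests without writing them
// back to the canonical entity or config object.
$store->set('wizard_step_1', $form_values);
$values = $store->get('wizard_step_1');
```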


And finally, we have the cache system. Drupal 8's cache system is actually far more robust than its predecessors, and is heavily leveraged by the rendering system. That's a topic for another time, though. For now, we're just looking at cases where you'll use it directly.

The general rule for caching something is "is it more expensive to compute this value than to look up an old version from the database?" Database calls are not cheap. (Even if not using SQL, you're still making some kind of I/O call which is the most expensive thing you can do in a program). Don't cache something in the cache system until and unless you know it's going to be helpful to do so. Often, that is other I/O intensive calls, like a web service call or a complex set of queries.

Another important rule for caching is that it should be for performance only. If the cache is wiped clean of all data, your code should still run. It may run slower, the site may run too slowly to be usable, but no irreplaceable data has been lost. Never, ever store data in the cache that you cannot recreate on-demand if needed. Similarly, don't store generated and regeneratable data elsewhere. That belongs in the cache.
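That rule translates into the familiar get-or-rebuild pattern; a sketch, with a hypothetical report-building helper:

```php
<?php
use Drupal\Core\Cache\CacheBackendInterface;

// Cache an expensive but regenerable computation. The cache ID is
// namespaced with the module name; mymodule_build_expensive_report()
// is a hypothetical helper.
$cid = 'mymodule:expensive_report';

if ($cache = \Drupal::cache()->get($cid)) {
  $report = $cache->data;
}
else {
  $report = mymodule_build_expensive_report();
  \Drupal::cache()->set($cid, $report, CacheBackendInterface::CACHE_PERMANENT);
}
```

If the cache entry is ever cleared, the code simply rebuilds the report, which is exactly the property the rule above demands.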

What to choose?

With so many options, how do you know where to put your data? While the lines are not always crystal clear, the following pointers should cover most cases.

Is it purely a performance optimization, and the data can be regenerated if needed? If yes, Cache.

Should it be configured on staging and pushed to production? If yes, use the Configuration system. If there will be an arbitrary number of them, use Config Entities.

Was it in the variables table before, but not something to push from staging to production? If so, it likely belongs in State.

Is it user-generated content on the live site? Most likely it should be a Content Entity.


Shitiz Gag's Blog: [GSoC 2015: Hawk Authentication] Week 12: Unit testing and finishing the modules

Planet Drupal - Tue, 2015/08/11 - 8:03pm

GSoC is wrapping up in another two weeks, that means it’s time to start wrapping up the module and make it in a shippable state. For that, I have started working on unit tests as well as documentation.

Unit Tests

Unit tests help maintain a project in the long term: they can automatically detect when a change someone makes in the future breaks another part of the module. For example, a change to the header library that adds some features could break an obscure client function that has been there since version 0.1. If that function has unit test coverage, the regression would get caught before the module is released to the general public, preventing some hassle.

My goal is to cover as many test cases as possible, accounting for both general and exceptional cases. So far I have done basic Hawk Auth tests and nonce validation tests; next I’ll be working on unit testing permissions, security and validation. I had to spend some time figuring out how Drupal does unit testing, as that is something I’m not really familiar with, and since I haven’t done a lot of unit testing before, this was a good opportunity for me to implement some test cases and become more familiar with it.

A wall I ran into was trying to use existing routes from the system module’s test submodule, router_test, to test whether Hawk authentication was working. After a couple of hours of tinkering, it occurred to me that the route I was using was specifically tied to basic_auth. Having spent a few hours realising I was trying to get around something that was intentionally the way it is, I implemented my own custom module just for testing. The module provides a few basic routes to emulate controllers while the unit tests run. For example, one route simply shows the current logged-in user’s username, to ensure Drupal is correctly identifying users via Hawk. As I implement more tests, I’ll implement more controllers to support them.
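As an illustration of that approach, a test module can expose such a route with a small routing file. Everything below (module, path and controller names) is hypothetical, not the project's actual code:

```yaml
# hawk_auth_test.routing.yml (hypothetical example)
hawk_auth_test.current_user:
  path: '/hawk-auth-test/current-user'
  defaults:
    # A controller that simply returns the current user's username.
    _controller: '\Drupal\hawk_auth_test\Controller\TestController::currentUser'
  requirements:
    # Any authenticated user may view; Hawk handles the authentication.
    _user_is_logged_in: 'TRUE'
```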


My blog was accepted into Drupal Planet last week, so I’ve started writing a few posts to recap my GSoC journey as well as provide introductions to Hawk, the Hawk Auth module, and the project itself. Hopefully these articles will help people understand the protocol and the project better. I’m trying to get a few posts out this week and next.

For now that is all, I’ll continue with unit testing and documentation until the next week.

Thank you!