Subscribe to Planet Drupal feed - aggregated feeds in category Planet Drupal

Mediacurrent: Sample Content for Local Dev and Testing With Drupal 8 Core Migrate

Wed, 2020/06/17 - 1:02pm

Recently, the subject of sample content for use with local onboarding and testing in a continuous integration environment came up on a project. I went on a bit of a foray to see what was out there.

The Current State of Demo Content

It seems there are two projects you might reach for to accomplish a sample content build right now (if I missed one, leave us a comment): Yaml Content, and Default Content (more on them at the end).

I think both of these projects miss the mark a bit. Both aspire to be a full content export/import system. For what I’m looking to accomplish, we don’t need to be able to export content, just make demo content for testing or local development.

Also, looking through the issue queues for both, I noticed some common goals:

  • Automatic delete of sample content
  • Not re-importing the same content on re-run
  • Or, re-importing the same content, but with updates
  • etc.

Soon, I found myself thinking:

These projects are both going to re-implement things Migrate already does.

So, I decided to see what I could do with core Migrate.

The Goals

I wanted to make a set of sample content that was simple, easy to maintain and extend, and that can be run locally or in a remote environment like Bitbucket Pipelines or your CI system du jour.

Testing and iterative development become much simpler if a baseline setup can be repeated quickly. Think running “ddev restore-snapshot” vs. importing a full mysql dump.

So this should be repeatable, should run fast, and should ideally require few or no new dependencies, to avoid slowing down local and CI site installs. Again, a goal is for this to be very repeatable so it is useful for testing.

Outside of the core requirement of Migrate, there will be at least one other dependency to get this working. That being some way to run the migration imports. If your project is already using Migrate Tools and Migrate Plus you’re ready to go, and if not I recommend installing Migrate Run as a dev dependency.
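Assuming a Composer-managed project, pulling in Migrate Run as a dev-only dependency might look something like this:

```shell
# Add Migrate Run only as a dev dependency so production stays lean.
composer require --dev drupal/migrate_run
drush en migrate_run
```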

Setting up an Import for a Basic Page

Looking for the benefits of few dependencies and a fast, repeatable runtime, what are our options? I wondered, “what is the simplest thing I could do to get a node on the page?” With just one core dependency and one contributed module, we can import a node!

The one core dependency is, again, Migrate.

The first thing we need in order to do anything custom in Drupal is a module. The minimal version of this module has just two parts.

First, the info file:

```yaml
name: Sample Content
type: module
description: Imports sample content using Migrate
core_version_requirement: ^8 || ^9
package: Example Project
dependencies:
  - file
  - link
  - migrate
  - path
  - paragraphs
```

File: sample/

This is the initial module boilerplate: almost none. We will not need to create a sample.module file. It does, however, declare dependencies on all the field types used in the upcoming examples.

Now, the second part: add a sample/migrations/ directory; the system will read Migrate plugins from here.
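So the whole module might look something like this (the migration file name here is my own choice; Migrate discovers any *.yml file inside migrations/):

```text
sample/
├── sample.info.yml
└── migrations/
    └── sample.nodes.page.yml
```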

This is all that is required to import a sample basic page node using the core “embedded_data” source plugin:

```yaml
id: sample_nodes_page
label: Sample Nodes - Page
migration_tags: sample
source:
  plugin: embedded_data
  data_rows:
    - title: Sample Page 1
      path: /sample_page_1
  ids:
    title:
      type: string
process:
  type:
    plugin: default_value
    default_value: page
  status:
    plugin: default_value
    default_value: 1
  title: title
  'path/pathauto':
    plugin: default_value
    default_value: 0
  'path/alias': path
destination:
  plugin: entity:node
```

File: migrations/

That’s all. If you just wanted to make a module that would import a single node with no field values, this will do that.

How Does This Work?  Do I Have to config-import This? 

No. If you’ve used Migrate Plus, you may be familiar with configuration import for its migrations. This, however, is a Migrate plugin definition, not a config entity, so it will not be imported/exported with configuration management.

To update Migrate plugins after editing the file, a cache rebuild is all that is necessary.
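In practice that’s just:

```shell
# Migration plugin definitions are cached; rebuild to pick up YAML edits.
drush cache:rebuild
```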

Ok, so how do I run this?

Well, now you’ve got me. Remember that one contributed module I mentioned earlier? Drupal core, at this time, provides no Drush commands for working with migrations. So, we need to make some or include a project that makes them for us.

Migrate Run provides the Drush commands we need. These commands were forked from Migrate Tools. Note: most projects, I find, will already require migrate_tools and migrate_plus (usually for ongoing data imports). If you have those installed, you can use the commands from migrate_tools instead; they are nearly identical.

Once you have migrate_run installed, enable your sample module (you would never forget to enable a custom module before trying to use it, right? I’ve never done this on every project ever…). Then, to check that your migration is being loaded, use the migrate:status command:

```
user@site-web:/var/www/html/web$ drush ms sample_nodes_page
 ------------------- -------- ------- ---------- ------------- ---------------
  Migration ID        Status   Total   Imported   Unprocessed   Last Imported
 ------------------- -------- ------- ---------- ------------- ---------------
  sample_nodes_page   Idle     1       0          1
 ------------------- -------- ------- ---------- ------------- ---------------
```

Then you can give the migrate:import command a spin:

```
user@site-web:/var/www/html/web$ drush mim sample_nodes_page
 [notice] Processed 1 item (1 created, 0 updated, 0 failed, 0 ignored) in 0.4 seconds (468.1/min) - done with 'sample_nodes_page'
```

Wait, What Just Happened?

If you look at your /admin/content page, you should see a single node with the title “Sample Page 1” and path /sample_page_1! Demo content! 🎉

Sure, this is just a node with a title and path alias for now, but we can experiment with changing things in the data_rows or process plugins to add field values and evolve this into something useful without much trouble.
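For example, assuming the page content type has the standard body field and a basic_html text format (both assumptions about your site), adding a field value is just another source column plus a process mapping:

```yaml
source:
  plugin: embedded_data
  data_rows:
    - title: Sample Page 1
      path: /sample_page_1
      # New source value; 'body_text' is an arbitrary name I chose.
      body_text: '<p>Hello from sample content!</p>'
process:
  'body/value': body_text
  'body/format':
    plugin: default_value
    default_value: basic_html
```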

Run `drush mr sample_nodes_page` (migrate:rollback) to roll back the import (this will delete the node).

Then make your changes to the file and run `drush cr` to reload the plugin definition. Tip: run migrate:rollback before changing the plugin YAML; if you accidentally introduce a syntax error, the rollback command will fail when it reads the plugin file.
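Put together, the edit-and-retest loop looks like this (the YAML file name is assumed):

```shell
drush migrate:rollback sample_nodes_page   # delete the previously imported node
# ...edit modules/custom/sample/migrations/sample.nodes.page.yml...
drush cache:rebuild                        # reload the plugin definition
drush migrate:import sample_nodes_page     # import again with the new values
```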

Ok, fine, but What About Paragraphs?

Yea, we use Paragraphs too. Who doesn’t?

I find this is right around the time people tend to get tripped up with Drupal migrations. When we edit a Paragraph in a node edit form, it looks like it is just part of the node. It’s all done right inline, after all. But there’s a lot of work going on behind the scenes to make it look seamless.

What’s really happening is that a paragraph entity separate from the node entity is created and managed for every paragraph item added to the node during editing.

If you’ve ever used Migrate before, you’re probably noticing the potential issue right about now: you cannot define entities inline in a migration. If this is a must-have for your demo content, you will really like Yaml Content.

It’s ok, though, because Migrate is built to handle this, and the system for relating entities to other entities is pretty easy to work with.

Defining Some Paragraphs

Before our Page nodes can render paragraphs, we have to actually have some Paragraph entities to reference.

My test project has a very simple paragraph type called “text” which has a single formatted text field. The sample github repository linked below also contains a set of sample field configurations describing everything used by these migrations.

```yaml
id: sample_paragraphs_text
label: Sample Paragraphs - Text
migration_tags: sample
source:
  plugin: embedded_data
  data_rows:
    - sample_id: text1
      body_text: |
        <p>Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.</p>
    - sample_id: text2
      body_text: |
        <p>Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.</p>
  ids:
    sample_id:
      type: string
process:
  type:
    plugin: default_value
    default_value: text
  'field_body/value': body_text
  'field_body/format':
    plugin: default_value
    default_value: basic_html
destination:
  plugin: entity:paragraph
```

File: migrations/sample.paragraphs.text.yml
Creating two Paragraph entities of type ‘text’. Note the destination plugin is now ‘entity:paragraph’.

That’s all well and good, but paragraphs have to be referenced by a field in a node to be rendered. This is where things get really cool. All we have to do is tell our node migration to look up the paragraphs created by this migration and place the entity ID created during mim into the entity reference revisions (ERR) field in our nodes when they are created.

We’ll need to add three new things to the node migration to get this to happen:

  1. A migration dependency -- this will tell Migrate that the paragraphs have to be created before they can be referenced by the node migration.
  2. A new value in the source data_rows to tell the node migration which paragraph we want to reference in each row.
  3. A new process item for the destination ERR content field to tell Migrate which migration(s) to look up the target IDs from.

Those look like this, respectively:

  1. The migration dependency that tells the system we are going to reference something from another migration:

```yaml
migration_dependencies:
  required:
    - sample_paragraphs_text
```

  2. The source data for this node that the field_content process item will reference below. It has to be an array so sub_process can operate on it correctly:

```yaml
content_items:
  - sample_id: text1
```

  3. The process information for how the source data relates to the migration we want to perform the lookup on. In my page content type, there is an ERR field called field_content that can reference Text and Image paragraphs:

```yaml
field_content:
  plugin: sub_process
  source: content_items
  process:
    target_id:
      plugin: migration_lookup
      migration: sample_paragraphs_text
      source: sample_id
    target_revision_id: '@target_id'
```


Putting That all Together

See this example repo on Github for the new full version of the “Sample Nodes - Page” migration that will reference these new text paragraphs. I cut that out here in the interest of page length since it’s the same as above, plus the three new parts.

With both migrations in place, to run them at the same time, use the tag option: `drush mim --tag=sample`. Even though the machine name “sample_nodes_page” would sort before “sample_paragraphs_text” alphabetically, the migrations will be run in the correct order due to our migration_dependencies.

You should see something like this:

```
user@site-web:/var/www/html/web$ drush mim --tag=sample
 [notice] Processed 3 items (3 created, 0 updated, 0 failed, 0 ignored) in 0.8 seconds (227.5/min) - done with 'sample_paragraphs_text'
 [notice] Processed 3 items (3 created, 0 updated, 0 failed, 0 ignored) in 0.4 seconds (468.1/min) - done with 'sample_nodes_page'
```

Hmm, How About File Uploads?

These are similar to the Paragraphs example in that the files need to be migrated and then referenced from the parent entity. For this we use the file_copy plugin to actually get the file in place so it can be referenced by our Image field (which lives on another paragraph type in my example).

```yaml
id: sample_files
label: Sample Files
migration_tags: sample
source:
  plugin: embedded_data
  data_rows:
    - fid: 1
      filename: sample1.jpg
    - fid: 2
      filename: sample2.jpg
  ids:
    fid:
      type: integer
  constants:
    # You may need to change this path if you cannot place your module
    # in a directory called simply "sample".
    files_path_rel: modules/custom/sample/files/
    dest_path: public://
process:
  docroot:
    plugin: callback
    callable: realpath
    source: '.'
  files_path:
    plugin: concat
    source:
      - '@docroot'
      - constants/files_path_rel
    delimiter: /
  file_source:
    plugin: concat
    source:
      - '@files_path'
      - filename
  file_dest:
    plugin: concat
    source:
      - constants/dest_path
      - filename
  uid:
    plugin: default_value
    default_value: 1
  fid: fid
  filename: filename
  uri:
    plugin: file_copy
    source:
      - '@file_source'
      - '@file_dest'
destination:
  plugin: entity:file
```

File: migrations/sample.files.yml

This is a lot longer than some of the others, but let’s break it down using what we’ve already seen and introduce some Migrate conventions. It’s pretty simple if you take it one step at a time.

Start at the bottom: the “meat” here is the file_copy plugin. This is where the actual work occurs and the File entities are created from the source rows. We can set that aside for now, though, and look at how the file_source and file_dest paths are built.

In Migrate, the source values used in process items can refer to other values that were produced from earlier process items. That’s what the '@file_dest' item is doing.

So, backing up to file_dest, that’s just a concat of a constant + the filename from the source row. Nothing too fancy, so our value would come out like “public://sample1.jpg”. This is then stored in the uri value after the copy.

The ‘@file_source’ path is similar, but needs a bit of runtime info, so it’s a concat of a value calculated by another process plugin: callback. If you know the absolute path of your build, you could put the entire thing in a constant and simplify this, but for completeness I assumed the path would differ between a local environment and a CI environment.
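For instance, if the docroot never changes between environments, the callback and the first concat could collapse into a single constant (the path below is an assumption; substitute your build's real path):

```yaml
source:
  constants:
    # Assumed absolute path; only safe if identical in local and CI builds.
    files_path: /var/www/html/web/modules/custom/sample/files/
process:
  file_source:
    plugin: concat
    source:
      - constants/files_path
      - filename
```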

Putting it all Together Again, Again

Here is a sample repo I created with all the example files in one spot:

Happy migrating!

Notes on Other Options

Yaml Content

Pros: Very well documented, which makes it easy to pick up and use; runs pretty fast, too.

Cons: Adds an extra dependency on your project; slightly esoteric yaml syntax for embedded entities; code is a little too abstracted/confusing.

Default Content

Pros: Seems very similar to Yaml Content, but for those who like hal+json.

Cons: Uses Drupal 8 REST (hal+json); not as well documented with examples; the workflow seems to almost require building content to export, and then turning around and using it for an import.

YAML Content gives you a low barrier to entry and is very well documented. However, over time you may want more features and the example content can become a bit unwieldy (order of imports is a thing that has to be juggled sometimes, for example). Overall, it’s a solid choice for demo content.

External Links


Blog: Top Drupal blog posts from May 2020

Wed, 2020/06/17 - 12:26pm

Last month, the Drupal community was abuzz with anticipation of version 9. Our recap of May’s top Drupal posts, then, features a lot of those related to the release of Drupal 9, from the new Drupal brand to recollections of the Drupal 9 Porting Weekend. We hope you enjoy revisiting them!


OSTraining: How to Integrate a Sliding Menu in Your Drupal 8 Site

Wed, 2020/06/17 - 6:00am

The Sidr module for Drupal 8 allows site builders and themers to add one or more sliding menus to their sites in a very uncomplicated way. It makes use of the open-source Sidr jQuery library. Keep reading to learn how to use this useful module!


Kalamuna Blog: Are you ready for Drupal 9?

Tue, 2020/06/16 - 5:55pm
Jaida Regan | Tue, 06/16/2020 - 08:55

Drupal 8 was one of the biggest releases in the platform’s history. Its release empowered the community to adopt more modern development technologies and tools. If you migrated a site from Drupal 6 or 7 to Drupal 8, you can probably attest that a successful migration required a considerable investment of time and resources. Luckily, migrating from Drupal 8 to 9 will likely be a walk in the park in comparison.

Categories: Drupal | Author: Anya Mykhailova

Drupal Association blog: A Drupal Contribution Guide - Guest Post

Tue, 2020/06/16 - 5:15pm

We have been sent the following rather fabulous guest blog post by Yogendra Prasad, Surabhi Gokte, and Karthik Kumar, and we wanted to share it with everyone here. We would love to see this inside the upcoming Contributor Guide (more details of which are coming soon...).

What is the Drupal Community?

The Drupal Community consists of all the people who use Drupal and come together as a group, voluntarily giving their time to make the community better.

Why should I contribute to the Drupal Community?

Come for the software, stay for the community!

Drupal has been our bread and butter for so many years, so it’s one of our prime duties to give back to the Drupal Community in whichever ways we can.


  • You get to learn as you work with the community worldwide.
  • You get to present yourself, which in turn brings visibility to both you and your organization.
  • You get to know about Drupal events happening throughout the community.
  • You get to participate in those events by speaking or volunteering.

What do I need before contributing?

  • A basic understanding of Drupal
  • Experience setting up Drupal locally
  • Knowing how to install contributed modules
  • Basic knowledge of Git
  • Knowing how to create and apply a patch
What is the Life Cycle of a Drupal Issue?

An issue can be in any of the following states:

  • Active - When a new issue is created, it is in an Active state.
  • Needs work - When the issue needs to be worked upon, it is in a Needs Work state.

One can pick the issues from either of the two states to start with.

  • Needs review - Once the issue is picked up, patches are submitted, and all the test cases pass, the issue’s state should be changed to Needs Review.
  • Reviewed & tested by the community - Once the issue has been reviewed by a contributor, it is moved to the “Reviewed & tested by the community” (RTBC) state, where one of the members of the core community team reviews it.
  • Fixed - When an issue passes from the RTBC state, it is moved to the Fixed state.
  • Closed (fixed) - After the Fixed state, the issue moves to Closed (fixed) automatically within two weeks. This is the last state of an issue.
  • Closed (duplicate) - When a newly created issue duplicates an earlier one, it is closed directly as Closed (duplicate).
  • Closed (won't fix) - This state indicates that an issue will not be fixed.
  • Closed (works as designed) - This state indicates that the behavior reported in the issue is exactly what was intended, so it is moved to “works as designed”. In other words, the issue raised is not a bug but a feature.
  • Closed (cannot reproduce) - When an issue cannot be reproduced, it is moved to this state.
  • Closed (outdated) - When an issue is either too old to fix or gets fixed within some other module’s issue, its state can be Closed (outdated).

Other States:

  • Patch (to be ported): When a patch needs to be ported to other versions of Drupal or of the contributed module.
  • Postponed: When the issue/feature/bug is postponed by the author/community and doesn’t need a fix right now.
  • Postponed (maintainer needs more info): When an issue is raised but, according to the maintainer of the contributed module, more info is needed before it can be fixed.
What ways can I contribute to the Drupal Community?

There are multiple ways to contribute to the Drupal community, and you don't need to have a developer background to give something to the community.

  • Drupal core issues: You can select issues to contribute to from the Drupal core issue queue. Here you can submit patches to an issue, review tickets that are in “needs review”, and, if you have found an issue in Drupal, log it by providing basic details about it.
  • Contribute a module to Drupal: If you have a feature in mind that you think can be used in Drupal as a standalone module, so that a larger audience can start using it, you can add your module to Drupal. To create it and get it approved, follow the steps mentioned here:
  • Contributed module issues: Along with Drupal core issues, there is a huge list of bugs/issues in contributed modules that you can pick up and fix by providing patches:
  • Documentation: If you do not have a development background, or are not interested in contributing by writing code, another interesting way to help is improving the documentation. Documentation in Drupal is needed in the form of a README.txt in every single module/theme, code comments, class usage notes, etc.
  • Validating issues: If you are good at reviewing tickets or have a QA background, you can start contributing by verifying the fixes provided by other community members. For this, pick up tickets that are specifically in “needs review” status. This list contains both Drupal core and contributed module issues; it’s up to you which to pick and start working on.
  • Contribute financially: The Drupal Association is a nonprofit (501c3) organization serving the international Drupal community. You can also contribute to the Drupal Community in monetary terms by opting for an individual membership. Read more here -


  • Log a new issue: You can also log your own issues. The issue can be about either Drupal core or a contributed module. If you face an issue while using Drupal, whether in core or in any module, you can go and log the issue directly. Make sure not to log a duplicate issue.
How to start contributing?

How to get registered with

The very first step is to register yourself on the website by creating an account. If you already have an account, simply log in.

How to get registered to Drupal slack?

There are various communication channels for connecting with the community; the most popular one is Slack!

If you do not have an account on the Drupal Slack yet, create one first. Once there, you can join various channels as per your requirements, like #support, #frontend, #d9readiness, etc. You can also search for channels by clicking “+” > “Browse Channels” in the Channels section.

There are other mediums too like Telegram, Rocket Chat, IRC, etc. where people connect.

How to find issues?

Go to Drupal’s issue queue and filter the list based on your area of interest:

You can visit the issue queue to find the issues you want to work on; please log in first so that you can use the Advanced filter feature, which only appears for logged-in users.

For Drupal core-specific tickets:

For a combined list of Drupal core and contributed module issues:

Basic guideline to find issues:

There are a few filter criteria we follow before picking up any ticket from the issue queue; these criteria help us get an issue to closure faster. They are the following:

The filters on the issue queue will look like this:

  • Use issue tags (Novice, documentation, Drupal 9 compatibility): If you are new to the Drupal contribution arena and want to start with easy, straightforward issues that will help boost your confidence, filter the issue queue with the “Novice”, “documentation”, or “Drupal 9 compatibility” issue tags. This gives you a list of issues carrying those tags only.
  • Use the Component filter of your interest: Use this filter to get a list of issues in your area of interest, for example the Bootstrap system or the Ajax system.
  • Use the Status filter (Active/Needs Work/Needs Review/Patch to be ported): By default the issue list contains closed issues and other irrelevant statuses, so use the statuses mentioned to narrow the list.
  • Sort the issue list in ascending order of replies: This gives you issues with fewer replies, meaning nobody, or only a few people, have worked on the issue yet.
  • Sort the issue list in descending order of Last updated: This gives you the most active issues to start working on. Ideally, we prefer not to pick issues whose last activity is more than a year old; it implies the issue is less active, and there is a good chance it will take longer to close than a more active issue would.
  • Pick issues with an empty “Assigned to” value: Keep in mind that it is not advisable to pick a ticket that someone is already working on or that is already assigned to someone.

Note: You can use this URL to get directly to the filtered issue queue.

What to do after finding the issue?

Once you have identified the issue you want to work on or contribute to, follow these steps to help move it closer to closure.

Note: Keep in mind the version of Drupal core or the contributed module for which the issue was raised by the author. You can find the version on the issue detail page, as shown in the image.

In the given image, the issue's project is Drupal core and the version for which the issue is reported is 8.8.x-dev.

How to Setup a Local machine to replicate the issue?

Now you have to get your local machine set up. A local setup is necessary if you want to contribute a patch to the issue, or even just test a patch on it.

As mentioned above, you have to set up the exact version of Drupal core or the module for which the issue was reported.

  • If the issue belongs to Drupal core and you need to set up a specific version of Drupal core, follow this guide to set up Drupal:
  • If the issue belongs to a contributed module, set up Drupal with the latest stable version available on your machine and then clone the module needed to replicate the issue. For example, in the image below you can see the module version for which the issue was reported.

Now you have to get this specific version on your machine to replicate and fix the issue. For this, visit the detail page of the module, as given in the image above; on the detail page you will find a link for version control. For example, have a look at the image below:

When you click on Version control, you will get a page like the one in the image below:

Now follow the steps there and clone from the specific branch for which the issue was reported.

Note: Make sure you already have a Drupal installation, and make sure to place/clone the module in the proper directory so that you can use the module and fix the issue.

  • Understand the issue: The first step is to get a complete understanding of the issue. Try to replicate it on your local machine, on the same version for which it was reported.
  • In case of any query, add an appropriate comment to the ticket and ask for more details.
  • Once you have a clear understanding of the issue and an idea/approach to fix it, assign the issue to yourself so that other community members will not pick it up.
  • Add the necessary tags to the issue: In code sprints and Drupal events, when we pick an issue we usually add a relevant tag to the ticket, so that we can later filter the list of tickets by that tag and get a list of the issues picked up at that particular event.
  • Depending on the state of the ticket, start working on it.
How to Contribute by adding a patch to the issue?
  • Once you have started work on the ticket, you might reach a resolution, or you might have some more questions.
  • Comment back on the ticket with your questions and follow up there.
  • A resolution would be a piece of code change, a style fix, a README addition, annotation changes, etc.
  • Some issues already have a patch; in that case you have to add your changes on top of the existing patch and produce an interdiff.
  • After the issue is fixed, test it locally and create a patch.
  • Try to execute the unit tests on your local machine to make sure they pass.
  • If you have to write any test cases for your changes, do so.
  • The patch you attach to the ticket will be one of the following:
    • Patch file + comments
    • Patch file + interdiff file + comments
How to create a patch?
  • Use Git to create the patch.
  • For a new patch:

```
$ git diff
$ git diff > <ticket-number>-<comment-number>.patch
```

  • For an existing patch, download and apply it first:

```
$ wget <patch-url-from-ticket-to-download>
$ git apply -v <downloaded-patch>
```

    Make the needed changes on top of the existing patch, then create the new patch and an interdiff:

```
$ git diff
$ git diff > <ticket-number>-<comment-number>.patch
$ interdiff <old-patch> <new-patch> > interdiff_<old>-<new>.txt
```

  • Name the patch and the interdiff following the issue queue conventions.
  • Validate the patch on your machine:

```
$ git apply --check <new>.patch
```
How to attach a patch to an issue?
  • Add the patch as a file to the issue, with the needed information in a comment.
  • Update the ticket status to Needs Review after attaching the patch, and unassign yourself.
  • If the patch turns green and passes all test cases, the issue is ready to be reviewed by the community.
    • If the patch fails, look into the logs and rework the issue, assigning it back to yourself.
    • If you need help, seek it from the community by adding a comment to the ticket or asking on the Drupal Slack.
How to Contribute by reviewing an issue?

There is a great opportunity for folks who do not want to contribute patches: you can instead review and test the patches submitted for an issue.


  • Filter the issue list with “Needs Review”: To review any patch provided for an issue, filter the issue queue by “Needs Review” status, for both Drupal core issues and contributed module issues.
  • Identify the issue on which you can do the testing/reviewing.
  • Make sure to pick the latest patch attached to the issue, as shown in the image below:

As you can see, a list of patches has been attached to this issue, but you have to work with/review the latest patch, which you can find by sorting the comments by comment ID in descending order.

  • Make sure the patch has passed all the unit tests and turned green, as in the given image.
  • If possible, have a look at the code changes in the patch. You can review the coding standards, indentation, logic, or any technical debt.
  • If you have any comments/feedback/suggestions on the patch, go ahead and add a relevant comment to the issue.
Reviewing Patch Using “Dreditor”:

There is a browser plugin, Dreditor, which you can use to test a patch on a virtual/temporary setup. Set this plugin up in your browser, and make sure to restart your browser before using it.

Now, when you visit the detail page of an issue you shortlisted for reviewing, you will see something like the image below:

Two new action buttons will be displayed.

  • Testing/verifying the patch: To test a particular patch, click the new button to initiate the setup process.
  • After clicking, you will be redirected to a page like the one in the image below:

  • Select the version for which you want to validate the patch; it will be the same version as mentioned on the issue.
  • After selecting the correct version, click on “Launch Sandbox”. A process to set up an environment will start, like in the image below:

This will take some time to complete.

  • Once the process is complete, you will land on a Drupal site that already has the patch applied to its code base.
  • Admin login: If you want to log in to the system, use admin/admin as the username/password.
Reviewing Patch on a Local machine:

It is always better to test/verify the patch locally instead of using For this you have to follow the following steps.

  • Set up a local environment: Once you are OK with the code changes in the patch, set up your local machine with the version of Drupal core or the contributed module mentioned in the issue. Follow the steps given above to set up your local machine.
  • Apply the patch on your local environment: Download the patch by clicking the patch link, then apply it on your machine using:
    git apply <patch-file-downloaded>
    Note: make sure you run this from the root of the codebase against which the patch was rolled.
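If the apply/revert mechanics are new to you, the whole cycle can be rehearsed safely in a throwaway repository before touching a real Drupal checkout. Everything below (paths, file names, the issue number in the patch name) is invented for the demo:

```shell
set -e
# Rehearse the patch workflow in a scratch repo (all names are invented).
repo=$(mktemp -d)
cd "$repo"
git init -q .
echo "Hello" > README.txt
git add README.txt
git -c user.email=demo@example.com -c user.name=demo commit -qm "add README"

echo "Hello, reviewer" > README.txt
git diff > fix-1234567-1.patch    # roughly what an issue patch looks like
git checkout -- README.txt        # back to a clean working tree

git apply -v fix-1234567-1.patch  # apply it, as you would on your local checkout
cat README.txt                    # now shows the patched text

git apply -R fix-1234567-1.patch  # -R reverses the patch when you're done testing
```

On a real issue you would download the patch file from the issue page instead of generating it, but git apply and git apply -R behave exactly the same way.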
Validating/Testing the Patch (Cloud/Local):
  • Testing: Test the patch locally and verify that the issue is fixed after applying it. Test for regressions as well.
  • The patch works and the issue is fixed: If the patch is correct and works as expected, per the criteria mentioned in the issue, update the status from Needs Review to “RTBC” (Reviewed and tested by the community). Add any comments/images/videos needed to support your points.
  • If you feel more eyes are needed, comment on the ticket describing the steps you took for testing and ask for more people to look into it.
  • Seek help on Drupal Slack if needed.
  • The issue is not fixed / the bug still exists: If the bug is still present and not fixed by the patch, add a relevant comment tagging the author of the patch and move the ticket status back to “Needs Work”.
Next Steps after Contributing to Issue (Reviewed/Applied Patch):

Once you have done your job on the issue, do the following:

  • Put a proper comment on the ticket about your work. Add screenshots if necessary.
  • Move the ticket to the next relevant state, i.e. from “Needs Work” => “Needs Review”, or from “Needs Review” => “RTBC”.
  • Attribute your contribution by adding an organization and customer, if appropriate. Refer to the screenshot below:

Also, keep the following points in mind to check for any updates on the issue:

  • An update to the issue can be a new comment, updated information on the issue, or a change to its status.
    • Keep monitoring your dashboard: Drupal.org provides a personal dashboard containing your latest activities and a list of the issues you recently contributed to. Keep a close eye on it to follow the progress of the issues you worked on.

      You can access your dashboard after logging in to the site by hovering over your user image in the header section, as displayed in the image below:

  • You will receive all updates on the issues via email.
  • If needed, i.e. if the ticket status moves back to Needs Work/Needs Review and you feel you can contribute, pick the issue up and work on it accordingly.
  • If the status of the ticket is Closed, there is nothing more to do on it.
When is a Credit received for the contribution?

NOTE: You will not always receive credit for a closed issue on which you worked.

Once the issue is marked Fixed, the maintainer chooses which contributors on the issue will receive credit. Usually these are the contributors who:

  • Submitted a successful patch for an Active or Needs Work issue
  • Reviewed an existing patch on a Needs Review issue
  • Created the issue
Categories:

Drupal.org blog: How Drupal.org maintains geo-redundant remote backups with ease thanks to rsync.net

Tue, 2020/06/16 - 4:01pm

The following case study was written collaboratively by hestenet and nnewton, explaining how we use rsync.net to manage backups for Drupal.org. The Drupal Association used rsync.net for many years prior to any partner relationship, and is now proud to count rsync.net among our Technology Supporters. Drupal.org has been the home of the Drupal community for many years. Online since 2001, and fed by a global community of contributors, it holds a tremendous amount of open source history.

It's critical that we safeguard that history for posterity, and of course all of our current activity so that we can maintain the momentum of the Drupal project.

Naturally, we've done a tremendous amount of work to make our infrastructure robust and fault tolerant from the top of the stack to the bottom. Individual servers use RAID storage, our infrastructure is built using highly-available pairs, and the Oregon State University Open Source Lab, our data center, has good data center hygiene and redundant power and cooling.

But disasters can and will happen, and this is why off-site backups are critically important. Drupal.org uses rsync.net to manage off-site backups, and we highly recommend them as a solution. rsync.net is built on ZFS, a file system we have experience with and trust to be durable and to offer cheap, immutable snapshotting. rsync.net gives you an empty filesystem to do anything you want with, and it works with any SSH- or SFTP-based tool. This standard approach allows us to easily use the service with existing tooling. We have used rsync.net for various purposes for almost ten years and have not had a single incident.

How exactly do we use rsync.net? It is actually configured as the primary backup location for all of the Drupal.org infrastructure. In addition, because we take advantage of rsync.net's geo-redundancy feature, those backups live in multiple, separate data centers. We also use rsync.net as a secondary backup layer for some select data pools that are already backed up in Amazon S3 or on the Open Source Lab's backup servers.

How do we have it configured?

For the Drupal.org infrastructure we use BorgBackup to manage compression, encryption, and deduplication of our backup data. We then entrust rsync.net with ZFS snapshotting of the Borg data, providing us with points in time we can easily roll back to. This gives us a sliding window of encrypted backups. It also protects us from malicious actors or ransomware, as the snapshots are immutable (read-only).

The actual execution of Borg and tracking of backups is done using a bash script that is placed on each server by our Puppet tree. Puppet also places a private key for encryption and the appropriate SSH private key for access. We wrap Borg in a script because we need to cleanly initialize new vaults when we spin up a new server, and because we need to monitor Borg's execution. One thing we have found is that it is difficult to detect silent failure in Borg specifically, so we have multiple points of feedback in the script. Our script functions as follows:

  1. Check whether a vault for this host exists on rsync.net; if not, create it.

  2. Back up the paths passed to this script to said vault for today’s date.

  3. Check the return code of the last command; if it failed, email our monitoring endpoint to trigger an alert.

  4. Use Borg to pull statistics from the last backup, such as count of files backed up, chunks backed up, size of backup, etc.

  5. Massage those statistics into usable metrics and send them to statsd, where they will end up in our monitoring system.

With this process we get an alert at the time of failure, and we can also create alerts based on the graphs we build from the statsd data. We do this to catch runs where a backup succeeded but the amount of data backed up dropped dramatically. That “success” may really be a failure in such a case, just not an obvious one.
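To illustrate step 5, here is a minimal sketch of what "massaging" statistics into statsd metrics can look like. The metric names, the to_statsd helper, and the sample numbers are all invented for this example; this is not Drupal.org's actual script:

```shell
# Turn backup statistics into statsd "gauge" lines (names/values invented).
to_statsd() {
  host=$1; nfiles=$2; csize=$3
  printf 'backup.%s.nfiles:%s|g\n' "$host" "$nfiles"
  printf 'backup.%s.csize:%s|g\n'  "$host" "$csize"
}

# In the real script these values would be parsed from Borg's reporting
# output; here we feed sample numbers:
to_statsd web1 41523 73400320
# → backup.web1.nfiles:41523|g
#   backup.web1.csize:73400320|g

# The resulting lines would then be shipped to statsd over UDP, e.g.:
#   to_statsd web1 "$nfiles" "$csize" > /dev/udp/statsd.example.internal/8125
```

The point is only that the script reduces whatever the backup tool reports to a handful of numeric gauges that a monitoring system can graph and alert on.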

Example dashboard built from statsd information:

A major reason we value rsync.net is that it presents a simple, standard SSH interface, allowing us to use tooling we can customize to exactly what we need, as in the above example.

Why would we recommend this to others?

We share the Drupal community's love of simple, elegant, and technically excellent solutions to problems. Configuration and backup management with rsync.net ticks all of those boxes; furthermore, the business side is run in a very friendly way, with frequent increases in capacity available at very reasonable rates.

It's proven to be an effective and affordable way to use the funding we receive from the Drupal community to protect the project, and we believe you can trust it to protect your own projects as well.


Specbee: Working with the Devel module in Drupal 8 to Generate Dummy Content

Tue, 2020/06/16 - 2:04pm
Working with the Devel module in Drupal 8 to Generate Dummy Content Karishma 16 Jun, 2020

The Devel module in Drupal 8 is an extremely useful module for developers. It includes three submodules: Devel Generate, Webprofiler, and Kint. In this article we will delve into the Devel Generate module and how to work with it.

When building a Drupal website, you want to have a lot of content in place to check the overall display: layouts, views, design, and so on. Content comes in different forms, so it is important to test the website with dummy content before adding live content. Instead of creating dummy content manually, wouldn’t it be nice to add it automatically? That’s where the Devel Generate module for Drupal 8 comes in handy.

What does the Drupal Devel module do?

Drupal 8 offers tons of helpful modules that can ease the job of a developer, and Devel has several useful features. As discussed above, several modules are part of Devel; one of them is Devel Generate. Devel Generate automatically creates sample or dummy content such as menu items, taxonomy terms, and nodes. This is helpful if you need to test or showcase your Drupal website with data like dummy users, content, and images: the Devel module can create it all for you in a moment. Devel and its associated submodules are meant for local development environments and should not be enabled on live production sites.

Getting started with the Devel Generate module

Working with the Devel module is as simple as installing it. Let’s get started with the installation and learn how to use the Devel module in Drupal 8.


Installing the Devel Generate module for Drupal 8 is like installing any other contributed module. I’m using Composer to install it, since it automatically installs all of the necessary dependencies. Open a terminal and, within the project, enter the following command:

$ composer require drupal/devel

Next, enable the Devel and Devel Generate modules.
Generate content using the Devel UI

 1. Go to Configuration. Here you will see a long list of options under Development. Choose Generate Content.

 2. The Generate content interface offers a number of options to set before you hit GENERATE:

  • Generate content for all content types or select for specific content type
  • Check the checkbox if you want to delete all content of a certain content type before generating new content
  • Specify how many nodes to generate and how far back the node creation dates should go
  • Specify the maximum number of dummy comments for each node that supports comments
  • Set the maximum number of words for the title of each node
  • Hit Generate


3. You can see the list of dummy nodes generated by going to admin -> content

4. This is what an article looks like with a dummy title, image, and body content

Helpful hint: If taxonomy and tags are defined, the generate content module will assign tags at random. If you create vocabularies and terms first and then generate the content, you will end up with more useful test data. Devel can generate vocabularies and terms automatically, but they will be in fake Latin. It’s better to create your own vocabularies and terms because it makes testing easier when you have a meaningful taxonomy to work with.

The Devel module in Drupal 8 is a huge time saver for Drupal developers. This article should have given you a brief overview of how the Devel Generate module automatically creates dummy content in bulk for a development site. To know more about us and how we can help you leverage the powerful features of Drupal, contact us now.



Categories:

Customise scaffold files the right way

Tue, 2020/06/16 - 11:22am

There are some key files like robots.txt and .htaccess which are often tweaked for Drupal websites. These can be considered part of the 'scaffolding' of a site - they control the way the site works, rather than its content or design. Any new release of Drupal core that changes them specifically mentions that they need updating, as those changes may have to be merged with any customisations made on your site. For example, there was a security release that added rules to .htaccess which were essential for any site to incorporate. The template settings file, default.settings.php, also gets regular updates which are easy to miss. The new Drupal Scaffold composer plugin can now ensure that these files are always up-to-date by default. But that can mean it's now too easy to lose customisations, as those files are taken out of our direct control. (They now behave like files from external dependencies, which are usually excluded from version control.)

It's not a good idea to 'hack' (i.e. make changes to) core files. Drupal developers even dissuade each other from doing this by joking about bad things happening to kittens! But while these scaffolding files may come from core, they all live outside of Drupal 8's /core directory. (A full list of these files is near the bottom of this article.) This leaves them vulnerable to the forgetful developer coming along and tweaking them without thinking. To be fair, it's quite right to expect to be able to tailor them for SEO, specific business requirements, performance gains, debugging needs or whatever.

So the Scaffold composer plugin provides some ways to customise these files in a 'nice' way, all of which require some little edits to your project's root composer.json file.

  1. Simply append or prepend some lines

    Create a file containing the lines that you want to add, and reference it within the 'extra' section:

    "extra": { "drupal-scaffold": { "file-mapping": { "[web-root]/robots.txt": { "append": "assets/my-robots-additions.txt" } }, ... } }

    Replace 'append' with 'prepend' as the key if needed. This is great for robots.txt, which usually just wants some additions beyond what Drupal normally provides. I've used it for default.settings.php to suggest some useful project-specific config overrides for developers.


  2. Override a file entirely

    Create the file you want to use instead of core's version, and reference it within the 'extra' section:

    "extra": { "drupal-scaffold": { "file-mapping": { "[web-root]/robots.txt": "assets/robots-override.txt" }, ... } }

    This loses out on any improvements that Drupal may add over time, but is handy if you want to take back control of the file entirely. For example, some SEO agencies like to determine the contents of robots.txt entirely (although the RobotsTxt module may be more useful for that).


  3. Patch a file

    Create a patch of changes that you want to make, and use the post-drupal-scaffold-cmd script event hook:

    "scripts": { "post-drupal-scaffold-cmd": [ "cd docroot && git apply -v ../patches/my-htaccess-tweaks.patch" ] }

    This is really useful if you have specific changes to merge into a specific place of a scaffolded file, like in .htaccess. This ensures you get the benefit of updates made by core to the file.

    Pro tip: run composer install; git diff -R .htaccess > patches/my-htaccess-tweaks.patch to produce the patch if .htaccess is still under version control!


Once these are in place, you can remove and exclude all the scaffolded files from version control, if you haven't already. Here are example commands you could run to remove them. Make sure to replace 'docroot' with your webroot subdirectory.

# Commands to remove scaffolded files
git rm .editorconfig .gitattributes --ignore-unmatch
cd docroot
git rm .csslintrc .eslintignore .eslintrc.json .ht.router.php .htaccess \
  index.php robots.txt update.php web.config \
  modules/README.txt profiles/README.txt themes/README.txt \
  example.gitignore INSTALL.txt README.txt \
  sites/README.txt sites/ sites/example.settings.local.php sites/example.sites.php \
  sites/default/ sites/default/default.settings.php --ignore-unmatch

...and a snippet you could paste into your project's .gitignore file. (Again, replace 'docroot' if necessary.) This should then be committed for this to all work out.

# Lines to add to your project's .gitignore file.
# Files from the Drupal scaffold for composer.
/.editorconfig
/.gitattributes
docroot/.csslintrc
docroot/.eslintignore
docroot/.eslintrc.json
docroot/.ht.router.php
docroot/.htaccess
docroot/example.gitignore
docroot/index.php
docroot/INSTALL.txt
docroot/README.txt
docroot/robots.txt
docroot/update.php
docroot/web.config
docroot/sites/README.txt
docroot/sites/
docroot/sites/example.settings.local.php
docroot/sites/example.sites.php
docroot/sites/default/
docroot/sites/default/default.settings.php
docroot/modules/README.txt
docroot/profiles/README.txt
docroot/themes/README.txt

A current list of the files can be found in core's composer.json file.

Good luck - you can now rest assured that the Drupal kittens will rest in peace 😅


Photo by Dan Diza on Unsplash


Drupal CMS Guides at Daymuse Studios: Drupal VM vs. Acquia Dev Desktop for Local Development in Drupal 8

Tue, 2020/06/16 - 6:59am

For several years, I've done all my local development with the help of Acquia's Dev Desktop. It's a specialized *AMP (MAMP, for me!) suite for Drupal that'll let you launch new projects rapidly. It's free to use and doesn't need any specific integration with Acquia. If you're working on your own independent Drupal project, it can still be a great solution. I was recently roped into testing a virtual machine for Drupal via the Drupal VM project.


Kristen Pol: Look who's talking about Drupal 9

Tue, 2020/06/16 - 12:01am

I was researching Drupal 9 content and decided to collect some data. I gathered information related to the authors and the organizations whose content showed up in the top 100 Google results for "Drupal 9" (as of June 12, 2020). Check out the infographic with the data in a pretty format or skip to the plain text.

This is the first infographic I've created. I used Piktochart based on a Twitter recommendation from @akmalfikri. I liked the tool pretty well but was hoping for more mapping options based on region rather than country. If you have a favorite infographic tool, let me know.



Evolving Web: Drupal 8 vs. Drupal 9: More Features for Content Editors  

Mon, 2020/06/15 - 4:11pm

The recent official launch of Drupal 9.0 represents 4 and a half years of improvements (and more than 4,500 individual contributors) to the open source CMS designed to support the most ambitious digital experiences. The official party line, so to speak, is that "the big deal about Drupal 9 is that it's not a big deal."

If you're currently on Drupal 8, you can expect a simple, painless update that'll allow you to stay up to date with upcoming feature launches (Drupal 9.1.0 is planned for December 2020) and continuous security support. Not only that, but the power of Drupal is more accessible than ever to people without technical backgrounds. Engineering teams continue to benefit from the latest features and improvements made since 8.0. And everyone relies on the security that comes with updating the underlying technology stack. There have been so many improvements to Drupal's overall user experience since 8.0 was first released that there's plenty to celebrate.

Here are five features that make Drupal 9.0 the most accessible, intuitive, and user-friendly version yet—both for marketers using it to publish content and developers tasked with maintaining the code.  

  1. Visual page design with Layout Builder
  2. Intuitive media handling
  3. Customizable content moderation workflows
  4. Claro, a sleek, accessible core admin theme
  5. API-first architecture, featuring the JSON:API  
1. Visual page design with Layout Builder

The Layout Builder.

Drupal's Layout Builder lets content editors build and modify pages visually using drag-and-drop, eliminating a lot of reliance on developers and speeding up marketing workflows. Using intuitive, block-style layout controls, designers and editors can:  

  • Build default page templates for different content types (e.g. blog posts or feature pages)
  • Autonomously override default settings when a small change to the usual layout is needed
  • Create structured single-use landing pages (e.g. for an offer or an event) that don't necessarily follow a default template  

The Layout Builder is one of many examples of Drupal's renewed focus on accessibility and ease of use. It represents a major step forward for Drupal's user experience, distancing the CMS even farther from its past reputation of being intimidating to first-time or non-technical users.   

Keep in mind that Layout Builder is an optional module in Drupal that needs to be enabled. If you need help setting up a visual page designer for your editors, you can always reach out to our team of Drupal experts.   

2. Better Media Management

Media Library shown in table and grid views.

Drupal 9's WYSIWYG Media Library management system lets content editors and designers collaborate on images, videos, and other assets in an intuitive interface. Beyond the GUI, of course, the Media Library is fully customizable: you determine which fields to require for each type of media depending on your needs.   

Drupal's superior taxonomy handling extends to the Media Library, making it easy to organize libraries of all sizes according to whatever system works for your teams. With the Media module, files, images, videos, and all other asset types are treated like pieces of content, meaning they support fields and versioning just like a page would.  

Two views are available for the Media Library: table, which offers a more detailed look at each file's metadata, and grid, which displays an uncluttered overview of assets. You can choose which fields to display for both views. Here are full instructions for customizing your Media Library's interface.   

Keeping with the Drupal spirit of power-meets-accessibility, there are two ways to add media from the Media Library into a piece of content: via a reference entity (aka Hard Mode) or via the WYSIWYG text editor (9 out of 10 editors recommend this option). Here's how to set up CKEditor so it supports the media embed button. 

3. Content Moderation Workflows

Content moderation workflow.

If you have a website today, you're either putting out content on a regular basis or hoping to start doing so ASAP. Drupal helps content and marketing teams save time and streamline their moderation and publication process by enabling workflows that match their actual on-the-job needs. (Workflows have also undergone UX and accessibility improvements that make them more intuitive, meaning that once they're set up, people will actually use them, and see their value.)

By default, content in Drupal can be in one of two states: Published or Unpublished. With the core Workflows module, you can add custom states (such as Unassigned, Assigned, or Draft) beyond the default two to match your editorial process.   
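Such a workflow ultimately lives in exportable configuration. Below is a simplified, hypothetical sketch of a workflows.workflow.*.yml export with a custom "In review" state added to the defaults; a real export contains additional keys such as weights and dependencies:

```yaml
# Hypothetical, abbreviated config export: workflows.workflow.editorial.yml
id: editorial
label: Editorial
type: content_moderation
type_settings:
  states:
    draft:
      label: Draft
      published: false
      default_revision: false
    review:
      label: In review
      published: false
      default_revision: false
    published:
      label: Published
      published: true
      default_revision: true
  transitions:
    submit_for_review:
      label: Submit for review
      from:
        - draft
      to: review
    publish:
      label: Publish
      from:
        - review
      to: published
```

Because it is ordinary configuration, a workflow defined in the admin UI can be exported, reviewed, and deployed across environments like any other Drupal config.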

The Moderated content tab in Drupal 9's administration interface.

The companion Content Moderation module then lets you assign roles and permissions to those new states and transitions. These flexible role-based configurations mean, for example, that an editor just needs to access the Moderated content tab to see if any new drafts are ready for review.   

4. Claro Admin Theme

The Claro theme.

Claro, a sleek new theme for the admin UI, is available as the default admin theme in Drupal 9.   

What's so great about Claro? It overhauls the current Seven theme with a sleek UI that adheres to the new Drupal Admin Design System. In a nutshell, Claro aims to be more accessible, responsive, user-friendly, and visually appealing.  

The Claro vs. Seven theme.

The efforts behind building Claro were bolstered by the findings of the Drupal Admin UX Survey, an initiative conducted by the Admin UX study group (whose members include Suzanne and Annika from Evolving Web!) back in 2018 to learn more about how content editors were using Drupal. Read Suzanne's overview of the survey's findings for more on the editor pain points that led to Claro's development and implementation in Drupal core. (And stay tuned for Olivero, a brand-new theme that's slated to bring that modern look and feel to Drupal's default front end—it's one of the Drupal core initiatives, and we're hoping to see a release in 9.1 so we can collectively say "goodbye, Bartik" for good.)  

5. API-First = Future-First

An example of an augmented reality app.

A lot of improvements leading up to Drupal 9 have focused on creating more accessible experiences for a wider range of users, but this one's especially for the technical folks—although it has exciting implications for content creators.   

Drupal has always been the platform of choice for web projects with rich content requirements. Being API-first makes it flexible enough to handle more ambitious projects in the future.   

Drupal 9.0 is compatible with the latest technologies and frameworks (such as React, Angular, and Vue). Not only that, but its architecture is suitable for building headless applications that integrate via APIs with emerging channels and interfaces like augmented and virtual reality, wearable devices, and digital assistants.   

Thanks to the ability to package and export structured data via the built-in JSON API, engineering teams can choose to either use Drupal as a traditional coupled CMS, or as a headless CMS with a custom front end.   

What does this look like in practice? The possibilities here are quite literally endless. A quick example might be a city wanting to use existing data to build an augmented reality app that lets tourists interact with different landmarks. Drupal would be able to leverage structured JSON data from the city's existing database and inject it into the app's UI.    
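To make the "structured JSON data" part concrete: a JSON:API listing of article nodes is plain JSON that any front end can consume. The payload below is fabricated for illustration (a real response from /jsonapi/node/article carries many more keys, such as ids and links):

```shell
# A fabricated JSON:API-style payload; a real one would come from e.g.
#   curl -H 'Accept: application/vnd.api+json' https://example.com/jsonapi/node/article
response='{"data":[
  {"type":"node--article","attributes":{"title":"City Hall"}},
  {"type":"node--article","attributes":{"title":"Old Harbour"}}]}'

# Crude title extraction for illustration (a real client would use a JSON parser):
echo "$response" | grep -o '"title":"[^"]*"' | cut -d'"' -f4
# → City Hall
#   Old Harbour
```

An app front end would do the same thing with a proper HTTP client and JSON parser, then render the titles (or any other fields) in its own UI.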


So, while it's technically true that there aren't any new features launching with Drupal 9.0, thanks to its developers' commitment to constant improvement and updates, today's Drupal is still leagues ahead of Drupal 8.0. And there's more to come before year's end. Drupal releases new features twice a year, so you can expect some shiny new (and accessible!) goodies delivered with 9.1.0 in December.  

There's never been a better time to dive into this powerful, open-source CMS. Ready to learn Drupal? Attend our upcoming webinar, What You Need to Know About Drupal 9, or sign up for our Drupal 9 training course for a deeper dive.

+ more awesome articles by Evolving Web

Tag1 Consulting: How to work with JSON-RPC, derived schemas, and API documentation - part 3

Mon, 2020/06/15 - 3:45pm

For several years now, decoupled Drupal has been among the topics that has fixated members of the Drupal community. At present, there is no shortage of blog posts and tutorials about the subject ...


Web Omelette: New contrib Drupal module: Composite Reference

Mon, 2020/06/15 - 2:36pm

Today I want to introduce a new contrib module called Composite Reference. Why the name? Because it’s meant to strengthen the “bond” between entities that are meant to live and die together.

In many cases we use entity references to entities that are not reusable (or are not meant to be). They are just more complex storage vehicles for data that belongs to another entity (for the sake of explanation, we can call this the parent). So when the parent is deleted, it stands to reason the referenced entity (the child) is also deleted because it is not supposed to exist outside of the context of its parent. And this is a type of composite relation: the two belong together as a unit. Granted, not all parent-child relations are or have to be composite. But some can and I simply used them as an example.

So what does the module do? Apart from having a fancy name, it does nothing more than make an entity reference (or entity reference revisions) field configurable to become composite. When the relation is composite, the referenced entity gets deleted when the referencing one is deleted. And to prevent all sorts of chaos and misuse, deletion is skipped if the referenced entity is also referenced by another entity (making it by definition NOT composite). This should not happen, though, as you would mark relations as composite only in cases where the referenced entities are not reusable.

And that is pretty much it. You can read the project README for more info on how to use the module.

This project was written and is maintained as part of the OpenEuropa Initiative of the European Commission.


DrupalEasy: Automatically remove the Drupal core README (and other) scaffolding files

Sun, 2020/06/14 - 5:44pm

When creating a Drupal 8 or 9 project using the drupal/recommended-project Composer template, you may notice during certain Composer commands that the scaffolding files are copied from an "assets" directory inside of Drupal's core directory to their proper place in the codebase. 

But, did you know that the plugin that manages this process, drupal/core-composer-scaffold, can be easily customized in your project's composer.json file to not copy some of the scaffolding files?

For example, if you don't want the core README.txt scaffolding file created, then all you need to do is add the following to the "drupal-scaffold" data of the "extra" section of your project's composer.json:

"file-mapping": { "[web-root]/README.txt": false }

The syntax of the file-mapping is simply "destination: source". Setting the source to false tells the plugin to copy the [web-root]/README.txt file from nowhere; in other words, not to scaffold it at all.

The power of the scaffolding plugin doesn't stop there - you can also add additional scaffolding files and modify existing ones, all automatically.

Want to learn more about Composer? Check out our Composer Basics for Drupal Developers workshop.


DrupalEasy: DrupalEasy Podcast 232 - Ted Bowman (Project Update Bot), Michael Schmid (Amazee Lagoon)

Sat, 2020/06/13 - 6:03pm

Direct .mp3 file download.

Ted Bowman, Drupal core contributor and member of Acquia's Drupal Acceleration Team (DAT), talks about the Project Update Bot. Also, Michael Schmid, the CTO of amazee.io, joins us to talk about their Drupal hosting solution, Lagoon, and how it fits into a best-practice-focused Drupal development workflow.

URLs mentioned

DrupalEasy News

Subscribe

Subscribe to our podcast on iTunes, Google Play or Miro. Listen to our podcast on Stitcher.

If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.


Drupal In the News: How to Contribute to Open Source: The Ultimate Guide

Fri, 2020/06/12 - 11:26pm

Recently, Heather Rocker and Tim Lehnen of the Drupal Association, with other open source experts, shared with Builtin tips on the best way to contribute to—and start building relationships in—open source.
Involvement with open source can be a rewarding way to learn, teach, and build experience in just about any career/skill you can imagine.

“We want you here. All the leaders within the community want that,” says Tim Lehnen. “There are a lot of people, especially in Drupal, who built their whole careers out of showing up on IRC 14 years ago and saying, ‘Hey, is there something I can help out with?’”

There are plenty of ways to get started with open source. But there are also many questions to answer to find your place: What does it mean to contribute? How do you find the right project? How do you orient yourself to a new project? What if you don't know how to code? What if something goes wrong? Breaking into a new community as a new contributor can be hard, and project leaders know that.

Professional growth and relationship building in open-source spaces are, in some ways, easier than in traditional work environments. Many open-source projects are also taking steps to make their communities more inclusive and diverse, because building a community that encourages people to use, contribute to, and evangelize a project is very important.

Open source is made by people like you: one issue, pull request, comment, or high-five at a time. Read the full article to get inspired in getting involved with open source.


OpenLucius: A custom Drupal AJAX form in a working example module (Also sends emails and validates email address)

Fri, 2020/06/12 - 2:47pm

While building this Drupal 9 site a few months ago, we needed to handle the contact form. Normally we would use the Drupal Webform module, but it wasn't Drupal 9 ready enough at the time. So we decided to build the form ourselves, AJAX-based, to make it as user-friendly as possible and to learn a few things. We published the complete code in a working, installable Drupal module. A small explanation:

So the Drupal example module contains:




OpenSense Labs: The Ultimate Drupal SEO Guide for 2020

Fri, 2020/06/12 - 1:49pm
By Tuba Ayyubi

Your website is the centre of your marketing world, and SEO is essential for any business that operates online. Making sure a search engine can easily understand your content is the first step toward visibility in search engine results.

Drupal is well suited to SEO. It ships with many built-in SEO-friendly features, is super easy to customize, and can help you start driving visitors to your site from the day you launch.

The SEO essentials

The following are the most important pointers to tick off when looking to boost the search engine rankings of your Drupal website:

Checklist Essentials

SEO Checklist

The SEO Checklist module does not add any functionality to your website; instead, it serves as a reminder of all your SEO-related tasks and helps make sure you are maximizing SEO for your site. It is updated frequently with the latest SEO techniques. If you work on a lot of websites, this module is for you!

Real-Time SEO For Drupal

This module helps you optimize your content by working keywords in quickly and without spamming, and it works best alongside the Metatag module. Real-Time SEO for Drupal checks whether your posts are long enough, whether you have written a meta description and whether it contains your keywords, and whether you have used subheadings in your post.

Require On Publish

Require on Publish makes fields required only when content is published. It is useful for fields like tags or SEO information that don't generally need to be filled in until the content goes live, so editors can save drafts without them.

URL Essentials

Redirect

If you need to handle duplicate content, the Redirect module will do it for you by letting you create URL redirects. It also integrates with Drupal's page cache to keep redirects fast.


Pathauto

This module makes sure your URLs are search-engine friendly. Search engines such as Google and Bing favor clean, readable URLs, and Pathauto automatically converts complicated system paths into cleaner, clearer URLs.

Easy Breadcrumb

Easy Breadcrumb is a plug-and-play module that uses the current URL and the current page title to automatically extract the breadcrumb segments and their respective links.

Menu Breadcrumbs

This module provides substantial benefits for both users and search engines: the breadcrumb shows where you are in the navigation hierarchy, and each segment is anchor text linking to the appropriate URL.


Footnotes

The Footnotes module automatically creates numbered footnote references in an article or post. It also offers an optional feature that collapses identical footnotes into one, as if they had the same value.

Link Checker

Broken links leave a bad impression on search engines. The Link Checker module extracts links from your content when it is saved, then periodically checks them by requesting the remote sites and evaluating the HTTP response codes.

Menu Attributes

This module is helpful in your SEO strategy if you want to nofollow a menu item to shape the flow of page rank. Menu Attributes is also handy when you want to give a menu item an ID so that you can easily select it using jQuery.

Tag Essentials

Metatag

This module provides global settings that control the meta tags on all pages. The Metatag module also supports multilingual sites, which is great for your website!


Hreflang

Hreflang is a simple module that adds the tags search engines use to serve the correct regional URLs in search results.
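Concretely, hreflang output is a set of alternate link tags in the page head, one per language variant (URLs hypothetical):

```html
<link rel="alternate" hreflang="en" href="https://example.com/about" />
<link rel="alternate" hreflang="de" href="https://example.com/de/about" />
<link rel="alternate" hreflang="x-default" href="https://example.com/about" />
```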

Schema.org Metatag

This module is an extension of Drupal's Metatag module that displays structured data in the form of JSON-LD in the head of web pages. It helps you define default structured data values for all content types.
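The structured data it adds to the head looks roughly like the following JSON-LD (values illustrative, not the module's exact output):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The Ultimate Drupal SEO Guide for 2020",
  "author": {
    "@type": "Person",
    "name": "Tuba Ayyubi"
  },
  "datePublished": "2020-06-12"
}
```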

Power Tagging

This module extracts tags from your Drupal nodes easily and provides multilingual tagging. With its bulk-tagging feature, you can tag content all at once, and the tags it suggests can be curated by users or used to index collections of Drupal content.

Similar By Terms

If you want to surface related content based on shared taxonomy terms, the Similar By Terms module is the right one! It also allows you to create your own views, and it can match content automatically through a thesaurus even when synonyms are used.

OG Tags

OG (Open Graph) tags are designed to ensure good communication between your site and social media platforms like Facebook, LinkedIn, and Twitter. Once your site is connected correctly, it becomes easier to control the taglines and images that appear on social media. These tags make content more eye-catching in social feeds and convey in seconds what your content is about.
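As a sketch, Open Graph tags are plain meta elements in the page head (property values hypothetical):

```html
<meta property="og:title" content="The Ultimate Drupal SEO Guide for 2020" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://example.com/blog/drupal-seo-guide" />
<meta property="og:image" content="https://example.com/images/seo-guide-banner.png" />
```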

Validate The Information

Make sure all the information displayed on Facebook and Twitter is correct and well presented before people start sharing it.

Communicating With The Search Engines

Sitemap

The Sitemap module gives you an overview of your site and displays the RSS feeds for all the categories on your website.

XML Sitemap

This module generates a directory of your website, which your website definitely needs! XML Sitemap makes it easy for Google to crawl and index the site, and it gives you the flexibility to include or exclude specific pages from the sitemap.
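For reference, a sitemap entry follows the sitemaps.org XML format; the URL and dates here are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/blog/drupal-seo-guide</loc>
    <lastmod>2020-06-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```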

Simple XML Sitemap

This module is a Drupal 8 replacement for XML Sitemap, but it works differently. Simple XML Sitemap is a lean module with very few bugs, and it implements the newer, more powerful sitemap standard.


Cron

Cron is a task scheduler that executes tasks automatically, without any manual involvement. It keeps running in the background unless all of its jobs are switched off.

If you want your XML sitemap to stay updated and clean, cron will do it for you by checking for updates and rebuilding the sitemap.
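On a typical server, a crontab entry triggers Drupal's cron endpoint on a schedule; the domain and key below are hypothetical (Drupal generates the site-specific key, visible at /admin/config/system/cron):

```shell
# Hit Drupal's cron URL every hour; replace domain and key with your own
0 * * * * curl --silent "https://example.com/cron/HYPOTHETICAL_CRON_KEY" > /dev/null
```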

Google Analytics

The Google Analytics module tracks which links are clicked and which files are downloaded from your pages. It provides Site Search and AdSense support, and changes to URL fragments can be tracked as page views.

Editing Essentials Linkit

This module provides an easy interface for internal and external linking through an autocomplete field. Linkit is your solution to internal linking, with a user-friendly UI.

Editor Advance Links

This module checks that all the webpages have a unique title, class, ID, logo and primary image.

Speed And Security

Speed and security are two main factors that affect the SEO ranking of your webpage. Google rewards sites that are quick and secure; all else being equal, a faster site will rank higher.

Headline, Logo and Primary Image

The first thing a visitor notices on your website is the home page, so it is important to keep it clean, with three essential elements. First is the logo, so users recognize your brand the next time they see it somewhere. Second is a creative headline that defines your brand. Third is a primary image to go along with the headline.

Enable Search 404 module

This module helps improve your website's overall experience. When a visitor lands on a missing page, Search 404 performs a search based on the URL and shows related results instead of a bare ‘page not found’ error.

SEO Trends 2020

Domain Authority

Domain authority is a ranking score that predicts how well a website will rank in search engine results. Use it when you want to track the strength of your website.

Google estimates expertise, authoritativeness and trustworthiness which is also called E-A-T. E-A-T is a predominant ranking factor. Google wants to feature content that is written by experts in their field.

Google focuses on off-site signals to figure out your site's E-A-T. After creating an amazing website, you need to make sure it is cited and mentioned by other trusted websites and reliable sources. Mentions don't just mean links; they can be anything, like a simple tweet talking about your blog. Expert mentions matter a lot and help you build a better reputation in Google's eyes.

Voice Search

OK Google, how do you think voice search will affect my website’s SEO?

According to OC&C Strategy Consultants, 13% of households had a smart speaker in 2018, a figure expected to rise to 55% by 2022.

Source: Wordstream

Google tends to pull voice search answers from the top three results, usually choosing a page that contains both the question and the answer. This is one of the reasons FAQ pages matter for your website's search engine rankings; Google also tends to draw on featured snippets.

Featured Snippets

Featured snippets are the results that are featured on the top of Google’s search results. These snippets aim to answer the query of the user right away. If your content is featured as a snippet in search results by Google, it means that you are getting additional brand exposure. 

There are three types of featured snippets: paragraph, list, and table. According to GetStat, the most common of the three is the paragraph type.

In an Ahrefs study of 2 million featured snippets, 12.29% of search queries had a featured snippet in their results.

Source: Ahrefs

Visual Search

Visual search is poised to shape the future of SEO; let's have a look at how.

Like voice search, visual search begins with taking a picture. Google Lens has already been used 1 billion times, and apps like Pinterest handle almost 600 million visual searches per month.

Visual search is already useful for shopping, translation, recipes, identifying plants and products, and a lot more.

With so many apps using this feature, it would not be surprising to see it become an essential part of search. So if you want your product to show up as a visual search result, image SEO is the right answer!

Video SEO

According to Cisco, around 80% of online traffic will come from video. And even though there are countless videos on the internet already, HubSpot reports that 43% of people want more video content.

Source: Techcrunch

Search Intent

If you want to rank your website, you need to create content with search intent in mind. Search intent is the why behind a search: was the user looking to make a purchase, or looking for information?

So your first step should be choosing the right keyword. If you want to rank highest in the search results, you need to be the most relevant result for the query.

Click-Through Rate

Click-through rate is an important concept in search engine marketing. Getting more people to see your ad or snippet, and getting them to actually click it, improves your website's success rate.

It is important that your meta descriptions are 100% original and compelling enough to make users click through to your website.


Backlinks

Building backlinks is still the most effective way to improve SEO rankings and traffic, and that requires amazing content. If you want other websites to link to yours, you must have quality content to deliver!

According to a study by Ahrefs, top-ranking pages tend to attract followed backlinks.

Source: Ahrefs

Research Content

Bloggers, journalists, and writers love data. Publishing stats, surveys, or industry studies gives users a reason to keep coming back to your website as a reference.

This is a lot of work and requires a lot of time but it’s always worth it! 

Comments And Community

Google wants to see an active website with an active community, and that's what makes comments important. Comments can play a real role in the overall quality and ranking of a page. That said, Google doesn't penalize your website if you don't have community involvement.

Zombie Pages

Zombie pages, as the name suggests, are pages that are more or less like the zombies we see in movies: neither alive nor dead. They serve no purpose on the website.

A few zombie pages are never a problem, until they turn into hundreds of them and pull down your entire site's SEO, undoing all your efforts to push your page rank higher.

The first step in getting rid of these pages is to identify them. Then delete the pages and remove them from your XML sitemap.


The foundation of any SEO strategy is quality content and links. You just need the right modules and the right guidance, and Drupal is the best choice when it comes to content management systems.

Every year, new trends change the way information appears on the internet, ensuring users get the information they need in a matter of minutes. If you follow these trends, your site will definitely see a boost in its search engine rankings.

OpenSense Labs has always been keen to provide the best solutions for its clients and prospects. To improve your online presence, reach out to us at


Kalamuna Blog: We stand against injustice and inequality. And, we’re here to do the work.

Thu, 2020/06/11 - 11:09pm
By Andrew Mallis

George Floyd’s murder has deservedly sparked the flame of social unrest. Rodney King, Oscar Grant, Ahmaud Arbery, Breonna Taylor, and countless more cases known and unknown continue to highlight the systemic and institutional racism that American capitalism depends upon.

Our very participation in the systems that breed oppression make us complicit. Society must change. Lives are at stake.
