The idea of upgrading a website from Drupal 7 to Drupal 8/9 is gaining new momentum.
Now that Drupal 9 is here as an official release, it’s clearer than ever that Drupal 7 is getting outdated. It’s time to upgrade Drupal 7 sites so they can give their owners much better value. The Drudesk support team knows how to achieve this through upgrades and updates, as well as speed optimization, bug fixes, redesign, and so on.
OpenLucius: Update OpenLucius | Create your own social network with 'Posts' (and 3 other new features)
We collected user feedback and built new functions from it; we also improved existing features and the design. Four highlights:
1. Social network posts
You can now use Lucius as a social network, instantly: let everybody place Posts to share what is happening and create interaction with inline comments and likes.
Build community and culture out-of-the-box:
Image credit: Aaron Deutsch
DrupalCon Global 2020 is in a couple of weeks and there are a lot of amazing sessions. Hope you can make it! While preparing my own DrupalCon Global session, I reviewed the other sessions and made a list of ones you might want to watch to help you prepare for upgrading from Drupal 6, 7, or 8 to Drupal 9.
In some cases, it was very hard to choose just one on a particular topic. For example, there are 3 great layout builder talks! So, while these are some of my top picks, don't forget to check out all the DrupalCon Global sessions and add your favorites to your schedule.
If you are upgrading from Drupal 8 to Drupal 9 and don't need to make any website improvements, then you can focus on the Drupal 9 sessions. If you will be doing a redesign and/or upgrading from Drupal 6 or 7 to Drupal 9, check sessions for dreaming up your new site, planning your new site architecture, and implementing your new site.
Having devoured DevOps books for the past year and a half, it was interesting to find that some of them contained the terms Infrastructure as Code and Continuous Deployment. It was amazing to learn from the DevOps Handbook that companies were already doing these things in the early 2000s.
Specbee: How to manage Google Ads by integrating DFP (DoubleClick for Publishers) with your Drupal 9 website
As a publisher, it is especially important for you to get the most out of what is offered when it comes to pay-per-click advertising. While selling or buying ads in the ad space can be complex, Google allows granular control over all your ads and configurations through Google Ad Manager. Combine this incredible ad management platform with Drupal 9's easy-to-use integration methods, and you will be able to manage multiple ads on your site while providing insightful reports for better optimization.

What is DFP?
Google Ad Manager, previously known as DoubleClick for Publishers (DFP), is an ad server that helps individuals or businesses with a substantial number of page views generate revenue from their pages. This ad platform facilitates both the buying and selling of ads across various ad networks and multiple locations. Google offers its ad server in two variants: Google Ad Manager for Small Business (completely free) and Google Ad Manager 360. The small business offering has a more limited feature set but works well for small to medium-sized businesses.

How to configure Google Ad Manager
1. Creating Ad Units - Ad units are the basic components of Ad Manager. An ad unit defines the size of an ad and the specific location on your website or app where you want the ad to display.
Below is a sample screenshot of an Ad unit configured on the Ad manager account.
2. Delivering Ad Units - To deliver the corresponding ad units, we need to add Orders, Line items & Creatives.
• Orders in Google Ad Manager are where we add the advertiser and trafficker. In other words, if company A wants to buy ad space on our site, the first step to setting them up is to create their order in our Google Ad Manager account; all subsequent line items live within this order.
• Next, create a line item, which holds information about the specific run dates, targeting, and pricing of one or more creatives.
• A creative is a specific advertisement, such as an image file, a video file, or other content. One creative can be associated with more than one line item.
1. First, install and enable the DFP module.
2. Under the Structure menu, go to DFP Ad tags. In the Global DFP Settings tab, set the Network ID (prefixed with "/", e.g. /111111), which you will find in your Google Ad Manager account. Save the configuration.
3. Fill in the following details in the Add DFP tag form.
Ad Slot Name → Use the same label of Ad Unit configured in Google Ad manager account
Size(s) → Copy the same sizes of Ad Unit configured in Google Ad manager account
Ad Unit Pattern → Copy the exact pattern from the “Code“ of Ad unit configured in Google Ad manager account
Under “Display Options”, make sure “Create a block for this ad tag“ is checked.
4. Save the Form. This will create a block with the required ad script.
5. Place the block wherever it is required using Structure / Block layout (for all pages).
The Drupal DFP integration allows website publishers and builders to integrate their Google ad manager accounts with their Drupal 9 website. To learn more about how we can help you leverage the best of Drupal and its modules, please feel free to connect with us.
As Europe starts to reopen its borders, we feel like it’s time to give you an update about where we are now and how you can get ready for the Drupal community fun in the best and easiest way.
This past Sunday I did a livestream where I tried out the new issue forks functionality on Drupal.org. The feature is currently in beta and project maintainers need to opt in to the functionality. At the moment you can create a repository fork for an issue, create a work branch, and commit to that branch. You will need to manually create a patch to be uploaded, as merge requests with Drupal.org issues are still being worked on (the Drupal Association infrastructure team is hoping to have this available for DrupalCon Global!).
When you view an issue that supports issue forks, there will be a new block in the sidebar. This allows you to create an issue fork or view information about the existing fork and its branches.
Agiledrop.com Blog: Customer Experience, User Experience & Digital Experience: Basics & useful terms
In this blog post, we’ll take a look at some different terms related to experience in the digital landscape: Customer Experience, User Experience and Digital Experience. We'll explain each of them, discuss the differences and connections between them, and provide a short glossary of useful terms.
Doing this for a particular order is pretty straightforward. All you need to have is the coupon code ID.
Drupal migrations, despite the linearity of their definitions, contain a lot of inherited complexity. The reason is intuitive: although the Migrate API is a supersystem that offers a very simple "interface" of interactions for the user-developer who wants to build migration processes, in reality several subsystems interact with each other throughout a migration process: Entities, Database, Plugins… There are a lot of classes involved in even the simplest migration process. If we add the irrefutable fact that a migration will tend to generate errors in …
Waaaaay back in 2013, I wrote a blog post about importing and mapping over 5,000 points of interest in 45 minutes using (mainly) the Feeds and Geofield modules. Before that, I had also done Drupal 6 demos of importing and displaying earthquake data.
With the recent release of Drupal 9, I figured it was time for a modern take on the idea - this time using the Drupal migration system as well as (still!) Geofield.
This time, for the source data, I found a .csv file of 814 lighthouses in the United States that I downloaded from POI Factory (which also appears to be a Drupal site).

Starting point
First, start with a fresh Drupal 9.0.1 site installed using the drupal/recommended-project Composer template. Then, use Composer to require Drush and the following modules: Migrate Plus, Migrate Tools, Migrate Source CSV, Geofield, Geofield Map, and Leaflet.
Then, enable the modules using:

drush en -y migrate_plus migrate_tools migrate_source_csv geofield geofield_map leaflet

Overview of approach
To achieve the goal of importing all 814 lighthouses and displaying them on a map, we're going to import the .csv file using the migration system into a new content type that includes a Geofield configured with a formatter that displays a map (powered by Leaflet).
The source data (.csv file) contains the following fields:
So, our tasks will be:
- Create a new "lighthouse" content type with a "Location" field of type Geofield that has a map formatter (via Geofield map).
- Prepare the .csv file.
- Create a migration that reads the .csv file and creates new nodes of type "Lighthouse".
We will reuse the Drupal title and body fields for the Lighthouse .csv's Name and Description fields.
Then, all we need to add is a new Geofield location field for the longitude and latitude:
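For reference, the exported field storage configuration for such a field would look roughly like this. This is a sketch, not the post's actual export: the machine name field_location matches the migration mapping used in this article, but the exact keys may differ slightly in your install.

```yaml
# Sketch of config/sync/field.storage.node.field_location.yml, assuming
# a Geofield field with the machine name field_location on nodes.
langcode: en
status: true
dependencies:
  module:
    - geofield
    - node
id: node.field_location
field_name: field_location
entity_type: node
type: geofield
settings:
  backend: geofield_backend_default
module: geofield
locked: false
cardinality: 1
```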
Next, we'll test out the new Lighthouse content type by manually creating a new node from the data in the .csv file. This will also be helpful as we configure the Geofield map field formatter (using Leaflet).
By default, a Geofield field uses the "Raw output" formatter. With Leaflet installed and enabled, we can utilize the "Leaflet map" formatter (with the default configuration options).
With this minor change, our test Lighthouse node now displays a map!
Prior to writing a migration for any .csv file, it is advisable to review the file to ensure it will be easy to migrate (and roll back). Two things are very important:
- Column names
- Unique identifier
Column names help in mapping .csv fields to Drupal fields while a unique identifier helps with migration rollbacks. While the unique identifier can be a combination of multiple fields, I find it easiest to add my own when it makes sense.
The initial .csv file looks like this (opened in a spreadsheet):
In the case of the lighthouse .csv file in this example, it has neither column names nor a unique identifier field. To rectify this, open the .csv in a spreadsheet and add both. For the unique identifier field, I prefer a simple integer field.
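If you'd rather script this step than edit the file by hand, a quick shell snippet can do it. This is a sketch: the sample rows and filenames below are stand-ins for the real POI download, and the naive line-numbering approach assumes no embedded newlines inside quoted fields (for anything messier, stick with the spreadsheet).

```shell
# Stand-in for the downloaded POI file: Lon, Lat, Name, Description per row.
printf '%s\n' \
  '-70.890,42.328,"Sample Light A","First sample row"' \
  '-76.289,36.959,"Sample Light B","Second sample row"' > Lighthouses-USA.csv

# Write a header row, then prepend a sequential integer ID to each data row.
printf 'ID,Lon,Lat,Name,Description\n' > Lighthouses-USA-updated.csv
awk '{print NR "," $0}' Lighthouses-USA.csv >> Lighthouses-USA-updated.csv

cat Lighthouses-USA-updated.csv
```

The sequential ID produced by awk's NR variable becomes the unique identifier the migration's "ids" section will reference.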
Once manually updated, it looks like this:

Create the migration
If you've never used the Drupal 8/9 migration system before, it can be intimidating, but at its heart it is basically just a tool that:
- Reads source data
- Maps source data to the destination
- Creates the destination
Writing your first migration is a big step, so let's get started.
The first step is to create a new custom module to house the migration. First, create a new, empty web/modules/custom/ directory. Then, easily create the module's scaffolding with Drush's "generate" command:

$ drush generate module

Welcome to module-standard generator!
–––––––––––––––––––––––––––––––––––––
Module name:
➤ Lighthouse importer
Module machine name [lighthouse_importer]:
➤
Module description [The description.]:
➤ Module for importing lighthouses from .csv file.
Package [Custom]:
➤ DrupalEasy
Dependencies (comma separated):
➤ migrate_plus, migrate_source_csv, geofield
Would you like to create install file? [Yes]:
➤ No
Would you like to create libraries.yml file? [Yes]:
➤ No
Would you like to create permissions.yml file? [Yes]:
➤ No
Would you like to create event subscriber? [Yes]:
➤ No
Would you like to create block plugin? [Yes]:
➤ No
Would you like to create a controller? [Yes]:
➤ No
Would you like to create settings form? [Yes]:
➤ No

The following directories and files have been created or updated:
–––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––
• modules/lighthouse_importer/lighthouse_importer.info.yml
• modules/lighthouse_importer/lighthouse_importer.module
Then, let's create a new web/modules/custom/lighthouse_importer/data/ directory and move the updated .csv file into it - in my case, I named it Lighthouses-USA-updated.csv.
Next, we need to create the lighthouse migration's configuration - this is done in a .yml file that will be located at web/modules/custom/lighthouse_importer/config/install/migrate_plus.migration.lighthouses.yml
The resulting module's file structure looks like this:

web/modules/custom/lighthouse_importer/
  config/
    install/
      migrate_plus.migration.lighthouses.yml
  data/
    Lighthouses-USA-updated.csv
  lighthouse_importer.info.yml
  lighthouse_importer.module
Note that the lighthouse_importer.module, created by Drush, is empty.
While there are a couple of ways to create the migration configuration, we're going to leverage the Migrate Plus module.
For more information about writing migrations using code or configurations, check out this blog post from UnderstandDrupal.com.
One of the big hurdles of learning to write Drupal migrations is figuring out where to start. It doesn't make much sense to write the migrate_plus.migration.lighthouses.yml from scratch; most experienced migrators start with an existing migration and tailor it to their needs. In this case, we'll start with the core Drupal 7 node migration (web/core/modules/node/migrations/d7_node.yml).
Let's break up the configuration of the new lighthouse migration into three parts:
- Everything before the "process" section.
- Everything after the "process" section.
- The "process" section.
Our starting point (d7_node.yml) looks like this:
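The core file, abridged, looks something like the following. Treat this excerpt as a from-memory sketch and check web/core/modules/node/migrations/d7_node.yml in your own Drupal install for the authoritative version:

```yaml
# Abridged sketch of Drupal core's d7_node.yml migration.
id: d7_node
label: Nodes
migration_tags:
  - Drupal 7
  - Content
source:
  plugin: d7_node
process:
  nid: tnid
  vid: vid
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: title
  uid: node_uid
  status: status
  created: created
  changed: changed
destination:
  plugin: entity:node
```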
Let's update it to look like this:id: lighthouses label: Lighthouses source: plugin: 'csv' path: '/var/www/html/web/modules/custom/lighthouse_importer/data/Lighthouses-USA-updated.csv' ids: - ID fields: 0: name: ID label: 'Unique Id' 1: name: Lon label: 'Longitude' 2: name: Lat label: 'Latitude' 3: name: Name label: 'Name' 4: name: Description label: 'Description'
The main difference is the definition of the "source". In our case, since we're using a .csv as our source data, we have to fully define it for the migration. The Migrate Source CSV module documentation is very helpful in this situation.
Note that the "path" value is absolute.
The "ids" section informs the migration system which field(s) is the unique identifier for each record.
The "fields" section lists all of the fields in the .csv file (in order) so that they are available (via their "name") to the migration.Everything after the "process" section
This is often the easiest part of the migration configuration to write. Often, we just have to define what type of entity the migration will be creating, as well as any dependencies. In this example, we'll be creating nodes and we don't have any dependencies. So, the entire section looks like this:

destination:
  plugin: entity:node

The "process" section
This is where the magic happens - in this section we map the source data to the destination fields. The format is destination_value: source_value.
As we aren't migrating data from another Drupal site, we don't need the nid or vid fields - we'll let Drupal create new node and revision identifiers as we go.
As we don't have much source data, we'll have to set several default values for some of the fields Drupal is expecting. Others we can just ignore and let Drupal set its own default values.
Starting with just the mapping from d7_node.yml, we can modify it to:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status:
    plugin: default_value
    default_value: 1
Note that we set the default language to "und" (undefined) and the default author to UID=1 and status to 1 (published). The only actual source data we're mapping to the destination (so far) is the "Name", which we are mapping to the node title.
One thing that is definitely missing at this point is the "type" (content type) of node we want the migration to create. We'll add a "type" mapping to the "process" section with a default value of "lighthouse".
We have three additional fields from the source data that we want to import into Drupal: longitude, latitude, and the description. Luckily, the Geofield module includes a migration processor, which allows us to provide it with the longitude and latitude values and it does the dirty work of preparing the data for the Geofield. For the Description, we'll just map it directly to the node's "body/value" field and let Drupal use the default "body/format" value ("Basic HTML").
So, the resulting process section looks like:

process:
  langcode:
    plugin: default_value
    source: language
    default_value: "und"
  title: Name
  uid:
    plugin: default_value
    default_value: 1
  status:
    plugin: default_value
    default_value: 1
  type:
    plugin: default_value
    default_value: lighthouse
  field_location:
    plugin: geofield_latlon
    source:
      - Lat
      - Lon
  body/value: Description
Once complete, enable the module using:

drush en -y lighthouse_importer
It is important to note that as we are creating this migration using a Migrate Plus configuration entity, the configuration in migrate_plus.migration.lighthouses.yml is only imported into the site's "active configuration" when the module is enabled. This is often less than ideal, as it means every time you make a change to the migration's .yml, you need to uninstall and then re-enable the module for the updated migration to be imported. The Config Devel module is often used to automatically import config changes on every page load. Note that this module is normally for local use only - it should never be used in a production environment. As of the authoring of this blog post, the patch to make Config Devel compatible with Drupal 9 is RTBC. In the meantime, you can use the following to update the active config each time you make a change to your lighthouses migration configuration:

drush config-delete migrate_plus.migration.lighthouses -y && drush pm-uninstall lighthouse_importer -y && drush en -y lighthouse_importer

Testing and running the migration
Use the migrate-status (ms) command (provided by the Migrate Tools module) to check the status of our migration:

$ drush ms lighthouses
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
  Group               Migration ID   Status   Total   Imported   Unprocessed   Last Imported
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
  Default (default)   lighthouses    Idle     814     0          814
 ------------------- -------------- -------- ------- ---------- ------------- ---------------
If everything looks okay, then let's run the first 5 rows of the migration using the migrate-import (mim) command:

$ drush mim lighthouses --limit=5
[notice] Processed 5 items (5 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'
Confirm the migration by viewing your new nodes of type "lighthouse"!
If all looks good, run the rest of the migration by leaving out the --limit=5 bit:

$ drush mim lighthouses
[notice] Processed 804 items (804 created, 0 updated, 0 failed, 0 ignored) - done with 'lighthouses'
If you don't like the results, then you can roll back the migration using "drush migrate-rollback lighthouses" (or "drush mr lighthouses"), make your changes, update the active config, and re-import.

Next steps
There's a lot more to the Drupal migration system, but hopefully this example will help instill some confidence in you for creating your own migrations.
The "Leaflet Views" module (included with Leaflet) makes it easy to create a view that shows all imported lighthouses on a single map (see the image at the top of the article). Once you have the data imported, there's so much that you can do!
- Drupal 7 EOL extended
- DrupalCon Global Scholarship Mentors
- Examples for Developers now includes REST examples
- The maintainers initiative
- Drupal 9
- Upgrade Status module
- Google Data Studio
- Philip Pullman
- The Golden Compass (Northern Lights)
- Composer Basics online workshop - 7-hour (split over 2 days) online workshop - Monday, July 20 from 1:30-5pm ET (part 1) and Tuesday, July 21 from 1:30-5pm ET (part 2).
- Professional local development with DDEV - 2-hour, hands-on, online workshop held monthly (Tuesday, July 7).
- Local Web Development with DDEV Explained.
- Drupal Career Online - next semester begins August 31.
If you'd like to leave us a voicemail, call 321-396-2340. Please keep in mind that we might play your voicemail during one of our future podcasts. Feel free to call in with suggestions, rants, questions, or corrections. If you'd rather just send us an email, please use our contact page.
Srijan Technologies: Mechanics of Editorial Experience in Digital World vis-a-vis Ergonomics of Manufacturing Industry
Consider a situation wherein your car's indicators are placed near the glove compartment, the horn near the back seat, the ignition button near the fuel tank, and the steering wheel carries the button that opens the side doors. How unworkable would that be!
OpenLucius: Drupal how-to: redirect anonymous users to login page | A working example module (Drupal 9 compatible)
For our Drupal distribution we needed to redirect all anonymous users to the login page, as for now it's implemented as a closed platform for social collaboration. The Drupal 8 way didn't work anymore; we fixed it for Drupal 9 and published a working module.
So if you're building a Drupal social intranet, collaboration tool, or community, this might help you direct users the right way - so they don't get an unfriendly 'access denied'.
Keep in mind that you still have to build the correct access control into all your pages with the help of permissions / access checks / advanced route access; this module doesn't provide that.
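The linked module has the full details; the general wiring for this kind of redirect in Drupal 8/9 is an event subscriber on the kernel request. A minimal services definition might look like this (the module and class names here are hypothetical illustrations, not the actual OpenLucius code):

```yaml
# mymodule.services.yml (hypothetical names): registers a subscriber class
# that inspects the current user on each request and returns a
# RedirectResponse to the login page when the account is anonymous.
services:
  mymodule.anonymous_redirect:
    class: Drupal\mymodule\EventSubscriber\AnonymousRedirectSubscriber
    arguments: ['@current_user']
    tags:
      - { name: event_subscriber }
```

The subscriber itself must still exclude the login route (and similar public routes) from the redirect, or anonymous users will loop forever.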