At the beginning of April, the Drupal Association announced a new #DrupalCares campaign to secure funding to keep the Association's lights on after DrupalCon Minneapolis was mothballed due to certain global events.
Very quickly, many in the Drupal community stepped up, increasing contributions, making one-time donations, or even pledging a generous 2-for-1 match. I decided to pledge $1 for every like on this video, and as of today, it has over 800 likes!
A project manager’s guide to making your next site Drupal 9 ready
Drupal Rector is a tool that’s designed to help automate Drupal code upgrades. While most of the information to date about using Drupal Rector has been written from a developer perspective, in this post, we’ll put on our project (or product) manager hats and see what it means for your project.
First, let’s set a common scenario:
- As a project manager, you are responsible for scheduling maintenance and upgrades to your portfolio of Drupal sites and applications.
- Staying on Drupal 8 introduces project risks once we enter 2021.
- Moving to Drupal 9 compatibility as soon as possible guarantees the best return on your investment in the platform.
With these factors in mind, what does Rector mean for your project? Let’s take a look:
- Rector is a set of tools for automatically updating the PHP code that runs your Drupal site.
- Drupal Rector is a set of tools specifically targeted at Drupal 9 compatibility.
- Your development or support team can use Drupal Rector to help prepare your site for Drupal 9.
- You can start today, getting in front of future risk.
- Your team can contribute to the health and stability of the Drupal project by contributing to Drupal Rector.
What might a project plan look like? First, look at the resources you have. If you have developers, they can take one or more of the following steps:
- Create new Rector rules for use by all Drupal developers.
- Use Drupal Check or Upgrade Status to get a list of deprecations in your current code.
- Use Drupal Rector to apply updates to your custom code.
Drupal Rector can generate code updates for your custom project code, saving developer time. While not all issues can be fixed automatically, more than half of modules tested can be fixed simply with Rector.
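For the developers on your team, steps 2 and 3 above might look like the following from the terminal. This is a hedged sketch: the package names (mglaman/drupal-check, palantirnet/drupal-rector) and paths reflect how the tools were commonly documented at the time and may have changed since.

```shell
# Step 2: list deprecated API usage in custom code.
composer require --dev mglaman/drupal-check
vendor/bin/drupal-check web/modules/custom

# Step 3: let Drupal Rector rewrite what it can fix automatically.
composer require --dev palantirnet/drupal-rector
# Copy the example Rector configuration shipped with the package into the
# project root (see the package README for the exact file name).

vendor/bin/rector process web/modules/custom --dry-run  # preview the changes
vendor/bin/rector process web/modules/custom            # apply the changes
```

Running with `--dry-run` first lets the team review the generated diff before committing anything, which fits naturally into a code-review workflow.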
At Palantir, we have focused on step 1 in order to enable our Managed Support team to perform steps 2 and 3. We work in Kanban sprints with a small team, and we have seen measurable progress in as little as a week.
Taking such a proactive approach as a project manager is a great way to ensure continued project success. As Drupal 8 will likely no longer receive community support after Q4 2021, taking steps to support Drupal 9 now will save your organization time and money in the future. It is also a great opportunity to contribute to the Drupal project and get organizational credit for doing so.
InternetDevels: Drupal integration with Salesforce to improve sales & marketing: overview + case study
CRMs are helpful assistants for businesses in their communication with current and prospective customers. No wonder that many business owners want to integrate CRM software with their sites.
Open source as a concept has been on the radar of the software community for many years now, but in many ways it is only just starting to gain steam among the enterprise organizations and business leaders that are seeking new ways to ensure the longevity of the solutions and architectures they build. At its core, open source is about more than just software; it’s about the community that surrounds it. Leveraging and contributing back to open source can yield dividends not only for businesses searching for more robust technologies but also for our own careers and futures.
preston Wed, 04/29/2020 - 05:05
Paragraphs is a popular module that allows you to create components which can be used on an article or basic page for example. It’s a module which I’ve used on all of my projects for the last half-decade.
Instead of an editor writing all the page content in a text editor, a site builder can create a set of paragraph types that the editor can use to create pages.
An example of a paragraph type could be an accordion, slideshow, or any type of complex component.
The Drupal landscape in the last couple of years has changed thanks to Layout Builder, which is a core Drupal module that lets you control and create layouts on entities. Hence the name Layout Builder. If you want to learn more about the module then look at our Getting Started with Layout Builder in Drupal 8 tutorial.
Now, with Layout Builder on the scene, does that mean Paragraphs is dead? NO!
The Paragraphs module is great for implementing complex data models, i.e., grouping fields together. I have worked on projects where they have gone a little overboard with the use of Paragraphs and things got pretty messy. I’m talking about 4 levels of nested paragraphs. But if you control the complexity of the paragraph types and don’t have so many nested levels then you should be fine.
I’ve talked about Paragraphs on this site many times in the past. If you’re looking for a basic introduction then look at our Introduction to the Paragraphs Module in Drupal 8 tutorial.
These are the steps I went through on "Drupal 9 porting day" on April 28, 2020, which was initiated by Drupal core "Initiative coordinator coordinator" Gábor Hojtsy. It was organized around three timezone groups: Australia & New Zealand, Europe, and the Americas.
Image credit: Aaron Deutsch
When I announced the Drupal 9 module porting challenge two weeks ago, I did not fully understand what was to come. I offered to donate €900 to the Drupal Association #DrupalCares campaign for 100 projects newly ported to Drupal 9. Then more funders started to appear: Ron Northcutt offered another €900, and Ofer Shaal put in another €450. QED42 offered to match Ron's €900. It certainly grew much bigger than I anticipated, so it was time to step up the game.
So last week I announced and started organizing Drupal 9 porting day for April 28, 2020, to not let our funders keep their money. While my pledge was almost used up in the first week, the rest of the funds still remained to be earned. The idea of the porting day sounded good because we raise funds for the Drupal Association, get people together to do their first Drupal 9 releases, help out others' projects, drive the tools to their boundaries, do Drupal 9 core quality assurance, and grow the pool of ready modules before Drupal 9's launch, all at the same time. Some people would learn how to get ready for Drupal 9 for the first time, so we would spread some know-how and confidence in the release as well. That is like a win-win-win-win-win-win.
Nonetheless I was still blown away by the interest to participate. Lee Rowlands and Vladimir Roudakov signed up to start leading porting day in Australia / New Zealand while I was still well asleep. By the time I woke up there were already various new releases and issues opened. I started providing feedback there and then worked my way through the top 50 used projects that needed info file changes and releases. I made sure to do the deeper research needed and supported maintainers in taking the next steps. I also started getting patches for my own projects, and even though I did not think it would be feasible, thanks to contributors, we made one of my projects, Upgrade Rector, Drupal 9 compatible as well. I also helped fix a critical core bug in Drupal 9 that Christian López Espínola found while porting the Lingotek module suite. At least two companies, QED42 and Srijan, had groups of people internally gathering to rally and contribute. In my afternoon, Adam Bergstein and Mike Lutz came in from the United States to continue leading the day onwards.
Closing for the day. We had an awesome contribution effort today. It is incredible to see so many contributors participating and making this day wonderful. ❤️ Thank you… #DrupalCares - Jaideep Singh Kandari
At the time of this writing, altogether 126 issues were worked on. According to my scripts identifying newly Drupal 9 compatible releases, 43 such releases were made, including top 50 projects like honeypot and adminimal_admin_toolbar, and developer modules such as twig_xdebug and queue_ui. When I put this together with all the numbers in the challenge to date, it turns out these 43 projects exactly rounded out the second 100 projects. Yes, I went back to double-check!
This means Ron will now donate his €900 (which will be matched by Dries and Vanessa Buytaert and Drupal businesses to €2700) and QED42 will also donate their €900, totalling to an impact of €3600 funding for the Drupal Association from this second milestone of the Drupal 9 porting challenge. (Including the first milestone's €900, the directly donated funds are altogether €2700, for a total matched impact of €6300 in the #DrupalCares campaign).
If you did not get to do a first Drupal 9 release on porting day, no problem! We made a ton of progress on projects other than the ones that got releases and that will result in more releases. Some of them could be very soon. In fact, this challenge is not over, as there are still two more days, and we just entered the final round for Ofer Shaal's fund of 50 newly Drupal 9 compatible releases (max €450) for #DrupalCares. So please keep the releases coming! Thanks all!
Ps. Kristen Pol wrote up her detailed steps of working on Drupal 9 compatibility of others' projects. I suggest reading her tips for how to ensure compatibility and work with maintainers respectfully.
Having trained thousands of people how to use Drupal since 2011, I've learned a lot about what makes students tick. Most of our classes have been in-person, since it's easier to teach effectively when I can instill enthusiasm by jumping up to the whiteboard, or rush over to help someone who's stuck. In the wake of COVID-19 and the rapid transition to online learning, I'm reflecting on how to make virtual training as engaging as it can be. I'd like to share four key factors for effective learning that go deeper than which video-conference or chat platform to use.
One of the most essential parts of teaching is motivating the learning. When I teach, I constantly bring up use cases and scenarios to explain why something is the way it is, or how what you're learning will solve a problem when you go back to your desk and are working on a real-world project.
Students have less patience when they're interacting with you through a tiny screen. Without in-person presence, it's harder as a teacher to pull out the motivating examples that bring the material alive. Likewise, students are conditioned to expect higher production values from a YouTube explainer video than from a whiteboard explanation. To hold students' attention in a virtual format, the motivating materials need to be more visual and condensed, and referenced repeatedly.
Nobody likes to feel like they're behind the class, asking "dumb questions". Affirmation that you're in the right place learning the right thing is empowering. Newbie students will repeatedly ask themselves: Am I at the right level? Do I have all the prerequisites? Will I succeed? Being in a room of their peers who are facing the same challenge builds a sense of "I belong" and "I can do this". And it's contagious.
Fostering such affirmation is tougher in an anonymous crowd, so pairing students up or splitting them into small, supportive groups can help. As a trainer, I'm constantly working to understand where every student is at, acknowledging it's okay that they don't know something, and helping them get to the next step. To provide affirmation in a virtual learning environment, we incorporate break-out rooms and office hours, encourage chat between students, and reach out to each student for direct one-on-ones.
Sustaining student attention for several hours demands a lot of effort from the trainer, especially when the virtual training is competing with email and social media. But there are many techniques that can help.
Working through a problem together as a group allows students to switch between active and passive contribution, and take a break without falling behind. It's also satisfying if the exercises can be populated with realistic prepared content to produce a functional web application, perhaps a recipe listing or meme sharing site. And this is a great opportunity for injecting humour or cat photos.
Students tend to feed on the engagement of others. A couple outwardly enthusiastic students can bring a whole classroom to life. With virtual training, small, interactive classes are essential, as well as breaking up the material into shorter, easier-to-digest pieces.
Permission to Play
Every experienced programmer remembers how difficult it was to set up their first development environment. Many people who work with Drupal professionally only have access to Drupal in a corporate development environment that they're afraid to break, and don't know how to clone for themselves. When you're teaching a technical skill that requires experimentation, students need to overcome the fear of breaking things. A small typo or incorrect version of a dependency can cause a "fatal error" and block a student from even trying to get something to work.
One of the keys to success is to set up a learning sandbox that feels real, but that doesn't have the element of "I can't mess this up". Before each coding class, we send out instructions on how to set up a local dev environment using Acquia Dev Desktop, Lando, or WAMP. For a site building class, we usually suggest a cloud Pantheon sandbox environment. Before training starts, the trainer can walk around the room to ensure that everyone has their setup working, and troubleshoot any errors they might have encountered.
I am still figuring out how to best incorporate such "personalized tech support" in an online context, but helping people troubleshoot their environment and get set up in advance is definitely part of the solution. And tools like screen sharing and Remote Desktop are definitely helpful.
In Conclusion
In the coming weeks, our team will be experimenting with new training formats that will incorporate these ideas. We have a whole slew of live, online training sessions coming up. If you have feedback or ideas, specific technical solutions to suggest, or experience to share, we'd love to hear from you! And if you're passionate about teaching and web development, we're hiring for consulting roles in curriculum design, training delivery, and marketing for our training program.
Continuing our short series of articles highlighting ways that the Drupal software and its community are building solutions to help combat the effects of COVID-19, today we hear from Mike Ansley of Colorado Interactive. Here, he describes their project at the Colorado Department of Public Health & Environment and the Colorado Governor's Office.
Official State COVID19 Website
Colorado Interactive (CI) hosts, manages, and supports over 300 state and local government websites on its Drupal platform as the official portal integrator for Colorado’s Statewide Internet Portal Authority. In March 2020, Colorado Interactive proactively engaged the Department of Public Health & Environment (CDPHE), the agency leading the State’s COVID19 response. Leveraging Drupal’s flexibility and configurability, the CI team ultimately worked with CDPHE to develop, provision, and launch an entirely new, fully responsive COVID19 website.
The site immediately served as the State’s primary source of updates, aggregated data, and general information. To promote the resource, CI created a custom banner application in Drupal to implement universal COVID-19 messaging and links for its 300+ websites.
CI’s Drupal platform enables CDPHE staff to manage site content and leverage custom design elements to provide the public with clear, uniform information, including the integration of data visualizations from the State’s Tableau server to graphically illustrate the number of cases by county, age, date, etc. These data visualizations consistently account for 67% of the site's traffic across all hours of the day. At 4:00 pm each day, when CDPHE publishes the latest pandemic data set, traffic grows from roughly 1,500 to tens of thousands of users on the site.
Traffic peaked at 99,684 users at 7:00 am on Thursday, March 27, shortly after the Colorado Emergency Operations Center included the site’s URL in the message announcing the State’s Stay-at-Home Executive Order via the Emergency Alert System (which distributes cell-phone alerts to residents statewide). Over its initial two weeks, the COVID-19 website saw over 12 million page views and over 2 million user visits, with an average visit duration exceeding 2 minutes. In the initial month, over 3.6 million users visited the website.
Governor’s Stay-At-Home Resource Website
CI developed and provisioned the State’s official COVID19 resource, a Drupal website for the Colorado Department of Public Health & Environment, in March. In early April, immediately following the Governor of Colorado’s statewide Stay-At-Home Executive Order, CI responded to the Governor’s Office’s urgent request for a new website to aggregate resources for Coloradoans during the Order.
CI led a collaboration among the Governor’s Office, the Governor’s Innovative Response Team (IRT), and volunteers from the Citizen Software Engagement Group (CSEG) to design and publish an entirely new, fully responsive Drupal 8 website in less than 48 hours. The site features an online directory of services and resources available to Coloradoans as they face an extended period of a stay-at-home order. Visitors of the site can utilize extended predefined lists and key term search functionality to find resources. Via an integration with CI’s application tool, businesses and organizations can submit an online form to be included in the public-facing directory. The CI team also designed a new theme, including custom blocks to showcase calls-to-action, drive ease of use, and accommodate easy admin management. The Governor’s Office, as well as a variety of public and private sector stakeholders, have since acknowledged CI’s Drupal website as a valuable resource in helping Coloradoans during the crisis.
Today it is your turn.
After every major release of Drupal, we distribute a survey to get the community's feedback on what to focus on for future releases of Drupal.
The last time we conducted such a survey was four years ago, after the release of Drupal 8. The results were the basis for defining Drupal 8 product initiatives, organized on the Drupal 8 mountain that I've been using for the past few years.
An example result from the 2016 Drupal product survey. The result shows that in 2016 we decided to focus on "content authors" as the most important persona. Since that survey, we improved Drupal's authoring workflows, media management, layout building, and more.
In a similar way, this new survey will help lay the groundwork for the strategic direction of the Drupal project over the next 3–5 years.
👉Take the 2020 Drupal product survey here. The survey takes about 15 minutes to complete.
We'd like to hear from everyone who cares about Drupal: content managers, site owners, site builders, module developers, front-end developers, people selling Drupal, and anyone else who touches the Drupal project in any capacity. Whether you are a Drupal expert or just getting started with Drupal, every voice counts!
I will be presenting the results at the next DrupalCon, as well as sharing them on my blog. DrupalCon North America was originally planned to be in Minneapolis at the end of May, but because of COVID-19, it has been moved to a global, virtual DrupalCon in July. Join us at DrupalCon, or if you can't, consider subscribing to my blog to get the results via email. I look forward to hearing from you about how we should create the future of Drupal together.
To create the survey, I worked with Brian Prue, Gábor Hojtsy, Angela Byron and Matthew Grasmick. Initial versions were improved based on test runs with over a dozen people in the Drupal community. Thank you to all of them for their feedback along the way.
Combining the powers of Apache Solr and Drupal 8 results in unmatched digital experiences with high-performing, enterprise-level search features and functionality. In this article, we will learn why Apache Solr should be chosen and how to configure it in Drupal 8.
What is Apache Solr?
Solr is a solid and scalable open-source search platform that provides distributed indexing and load-balanced querying. Initially built by and for CNET Networks, this Java-based project was later donated to the Apache Software Foundation. Apache Solr is a strong choice for fast, reliable search applications. Big names like Netflix, Instagram and Twitter, as well as various e-commerce sites and CMSs, use Apache Solr for their search functionality.
Why choose Apache Solr?
With many options available in Drupal 8 core to implement your search functionalities and features, why should you choose Apache Solr with Drupal 8? Here are some reasons Apache Solr might be the best fit for your project:
- Apache Solr offers faceted navigation that lets users add multiple filters to navigate easily through piles of information. Facets are queryable navigation elements.
- It allows for full-text search that offers precise results, along with near real-time indexing and searching capabilities. Indexing with Apache Solr is not only faster; indexes can also be merged and further optimized.
- The hit highlighting feature highlights the matched words or phrases to make them easy to identify.
- The dynamic clustering feature allows grouping search results to offer related searches or recommendations.
- It provides spell checking and auto-complete suggestions for a better search experience.
Using Apache Solr with Drupal 8 gives you better control over your website search and offers an interactive admin interface. Check out how we enabled a leading healthcare provider to boost their search experience with Apache Solr and Drupal.
Implementing Apache Solr in Drupal 8
Let us divide this process into the following parts:
- Install Apache Solr
- Install the Drupal Solr Module
- Configure Apache Solr with Drupal Module
1. Install Apache Solr
Step 1: Install Java
As Apache Solr is completely based on Java, we need to install Java to begin with. Apache Solr 7 requires Java 8 or higher. If you don't have Java installed on your system, install it using the command below:
$sudo apt install openjdk-11-jdk
Verify the active Java version using the command below:
$java -version
Step 2: Install Apache Solr on Ubuntu
Now, you can download the latest Apache Solr version from its official site, or use the commands below:
$cd /opt
$wget https://archive.apache.org/dist/lucene/solr/7.7.2/solr-7.7.2.tgz
Now, extract Apache Solr service installer shell script from the downloaded Solr archive file and run the installer using the following commands.
$tar xzf solr-7.7.2.tgz solr-7.7.2/bin/install_solr_service.sh --strip-components=2
$sudo bash ./install_solr_service.sh solr-7.7.2.tgz
Now, Solr should be installed on your system. You can use these commands to start, stop, and check the status of the Solr service:
$sudo service solr start
$sudo service solr stop
$sudo service solr status
By default, Solr runs on port 8983. You can access your Solr admin panel by typing localhost:8983 in your browser.
2. Installing the Solr module in Drupal 8
The Drupal 8 Search API Solr module should be installed before we go any further. This module integrates Drupal with the Apache Solr search platform and provides a Solr backend for the Search API module.
Install this module in your site using Composer with the command below:
$composer require drupal/search_api_solr
Once it is done, enable the module: go to Extend and, under Search, enable Search API Solr Search.
3. Configure Apache Solr with the Drupal Search API Solr Module
Step 1: Create a core in Apache Solr
After installing Solr, you need to create a core in order to work with the platform. This is an important step: content is indexed into the Solr core, and you can see the indexed content in the core which you have created. You can create the core using the command below in Ubuntu:
$sudo su - solr -c "/opt/solr/bin/solr create -c first_solr_core -n data_driven_schema_configs"
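To confirm the core exists, Solr's Core Admin API can be queried from the command line. This is a hedged sketch; adjust the host, port, and core name to your setup.

```shell
# Ask the Core Admin API for the status of the new core.
curl "http://localhost:8983/solr/admin/cores?action=STATUS&core=first_solr_core"
# A response listing "first_solr_core" with an index section confirms
# the core was created successfully.
```

The same information is visible in the Solr admin panel's Core Admin screen; the API call is just handy when working over SSH.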
The create command will differ depending on where Solr is installed; here, it is in the /opt folder of my Ubuntu system. You can see the created core in the Solr admin panel.
Step 2: Create a Solr Server
In this step you need to create a Solr server and index in your Drupal website. To create a Solr server –
Go to Configuration -> Search And Metadata -> Search API -> click on Add server
When you click on Add server, you will be presented with a form to fill in:
- Server name: Enter the Server name
- Enable: You need to check the enable check box. If you don’t enable it, you cannot index the items
- Add server description: A brief description of the server.
Next, you need to configure the Solr backend. Click on the CONFIGURE SOLR BACKEND in the form. You can see the form elements as shown in the image below:
Solr Connector: There are four connectors available. You need to select a connector to use for this Solr server. I am using the Standard connector.
HTTP protocol: Choose http or https (depending on whether your server uses SSL).
Solr host: localhost (if your Solr Server is on a different machine, please enter the IP or hostname of that host here)
Solr port: 8983 (This is Default port)
Solr path: the path that identifies the Solr instance on the server, excluding the core name (usually the default "/"; some setups use "/solr").
Solr core: Enter the Solr core name which you created before.
The default values generally work fine, but you can change them as needed.
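To sanity-check the values you entered, it helps to see how they combine into the URL the backend will call. The sketch below assembles the endpoint from the example settings above; it assumes the common layout where the core is reachable under /solr on the host, which may differ on your setup.

```shell
# Example connector settings from this step.
SOLR_PROTOCOL="http"
SOLR_HOST="localhost"
SOLR_PORT="8983"
SOLR_CORE="first_solr_core"

# The resulting endpoint the Solr backend will talk to.
ENDPOINT="${SOLR_PROTOCOL}://${SOLR_HOST}:${SOLR_PORT}/solr/${SOLR_CORE}"
echo "$ENDPOINT"   # http://localhost:8983/solr/first_solr_core
```

If opening that URL in a browser (with /select appended for a query) returns a response rather than a 404, the settings in the form are pointing at the right place.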
Step 3: Create a Search API index
Next, we are going to create a Search API index that will index the datasources you select. To create an index:
Go to Configuration -> Search and Metadata -> Search API -> Add index
Index name: Enter an index name.
Data sources: Here you need to select the datasources whose items will be indexed for the search functionality. In my example, I selected Content. When you select Content, you will get the option to select bundles under CONFIGURE THE CONTENT DATASOURCE; select the bundles you want to index.
Server: Select the server to index the items. I have selected the Apache-solr-server.
Enabled: This enables the index. Don’t forget to enable the selected server as well for this to work.
Once done, hit Save and add fields.
After successfully creating an index, you now need to add fields to the index.
To add the field that you need, go back to the search API, click on Edit the index which you have created.
You will then see the Fields tab. Click on the fields tab and you will see a window as shown in the image below.
Next, click on the Add fields. In the popup that appears, you need to add the fields you need. After adding the fields, click on Done and then Save.
Now, the index and server are created on the website. The next important step is to copy the configuration files to the Solr core. For this, you need to download the config files from the website:
Go to Configuration -> Search and Metadata -> Search API and open the server you created.
You will get a window as shown in the image below –
Next, click on the “Get config.zip” button. This will download the config folder. Once done, extract the zip folder and copy all the files into the conf folder in the core.
To copy the files, open the terminal and move to the Solr core folder. In Ubuntu, Solr cores are stored in /var/solr/data. In the data folder, you will see the Solr core folder which you created earlier. Inside that core folder you will find a folder named conf containing some files. Delete those files and copy in all the files from the config folder downloaded from the website. Once done, restart your Solr server from the terminal.
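The copy-and-restart sequence described above might look like this in the terminal. This is a hedged sketch: the paths assume the default Ubuntu install described in this guide, and config.zip is assumed to have been downloaded to ~/Downloads.

```shell
# Unpack the config files exported from Drupal.
cd ~/Downloads
unzip config.zip -d solr_conf

# Replace the core's existing conf files with the exported ones.
sudo rm /var/solr/data/first_solr_core/conf/*
sudo cp solr_conf/* /var/solr/data/first_solr_core/conf/

# Restart Solr so the new configuration is picked up.
sudo service solr restart
```

If your core lives elsewhere (for example under a custom SOLR_HOME), substitute that path for /var/solr/data.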
The next step is to index the content:
Go to Configuration -> Search and Metadata -> Search API and click on the index that you have created. You will see a window as shown in the image below.
Next, start indexing from that screen. Once indexing finishes, all the content is indexed in the Solr core, and you can create a view of the Solr index and add the indexed fields to the view to display results from Drupal 8 Solr.
Apache Solr is an extremely flexible and robust search engine that can be integrated with Drupal 8 to create amazing search experiences. At Specbee, our Drupal developers have the expertise and the experience to help you create compelling Drupal web experiences by leveraging powerful features. Contact us now to talk to us about your next Drupal project.
Drupal is quickly approaching a critical inflection point in terms of its ability to adapt to and outperform other technologies in the web development space, particularly in the front end. Trends like decoupled Drupal are rapidly gaining adoption in the Drupal community, but such architectural approaches do not resolve the issue of how Drupal's front end can contend with the increasing focus on popular front-end technologies like React and Vue.
preston Mon, 04/27/2020 - 06:48
Consistent support for addresses, coordinates, maps and place search is definitely the Achilles' heel of most modern CMSs. It is a complicated matter that requires integrating a large number of independent solutions. The topic of geolocation has also come up in many large projects implemented by Droptica, and I will share some best practices based on them.
I will start by presenting the top 10 geolocation modules for Drupal. Familiarise yourself with them and choose the ones that best suit your needs. At the end, we will build together a sample places database, convenient to edit and eye-catching.
Modules for address support
Address
This module provides a field necessary to save the address, working for every country in the world. Thanks to it, you do not have to create separate fields for a street name, apartment number, city, and post code. The module's capabilities will be especially appreciated by people who need address support for different countries. Its configuration is very simple: just select the address elements that you want to be visible or required. The set includes ready-made validators (including ones for postal codes) and additional "Country" and "Zone" fields.
The Address module displays the saved addresses in their basic form; customising their appearance requires knowledge of custom templates and CSS. Alternatively, it is possible to combine address fields with other modules, which will be discussed later in the article.
Modules from the geolocation family
Geolocation Field
It is a module that adds a field with latitude and longitude to Drupal. Its basis is simple, but it also contains over 20 accompanying modules. In its simplest version, it contains a field definition, two types of controls for editing it (coordinates and an HTML5 widget that retrieves the location from the browser), and formatters that display the location as text (decimal, sexagesimal – in the form of degrees, minutes and seconds, and using tokens).
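To make the decimal and sexagesimal notations mentioned above concrete, here is a small standalone conversion sketch. This is plain Python, independent of the module itself; the coordinate (Warsaw's latitude) is used purely as an example:

```python
def decimal_to_dms(value):
    """Convert a decimal coordinate to (degrees, minutes, seconds)."""
    sign = -1 if value < 0 else 1
    value = abs(value)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60) * 3600, 2)
    return sign * degrees, minutes, seconds

def dms_to_decimal(degrees, minutes, seconds):
    """Convert (degrees, minutes, seconds) back to a decimal coordinate."""
    sign = -1 if degrees < 0 else 1
    return sign * (abs(degrees) + minutes / 60 + seconds / 3600)

# 52.2297 (Warsaw's latitude) expressed in degrees, minutes and seconds:
print(decimal_to_dms(52.2297))  # (52, 13, 46.92)
```

The module's formatters do this conversion for you; the sketch only shows what the two notations mean.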
If you want to use fields with coordinates in the views, you gain some interesting features. You can, for example, filter by distance in kilometres from a given point. This allows you to create a list of places within a given radius from the user. You can determine the position of the website visitor using the location service provided by the browser or based on the IP address.
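The proximity logic behind such a radius filter can be illustrated with a small haversine sketch. This is an independent example, not the module's actual implementation; the place names and coordinates are real Polish cities used only for illustration:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

places = {
    "Krakow": (50.0647, 19.9450),
    "Gdansk": (54.3520, 18.6466),
}
visitor = (52.2297, 21.0122)  # Warsaw, e.g. detected from the browser

# Places within a 260 km radius of the visitor:
nearby = [name for name, (lat, lon) in places.items()
          if haversine_km(*visitor, lat, lon) <= 260]
print(nearby)  # ['Krakow']
```

A Views proximity filter does essentially this in SQL, comparing each stored coordinate pair against the detected visitor position.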
Geolocation shows its full potential after running the secondary modules. Additional capabilities include:
- A map widget working both with commercial suppliers (Google Maps, Baidu, Yandex, Here) and with the open Leaflet library (with OpenStreetMap layers enabled by default). There are many configuration options here, and no additional modules are required.
- Support for static Google maps in the form of a standard PNG file.
- Experimental support for geometric shapes on maps – both simple and complex ones. Ready-made contours of countries and states of the USA and Canada are provided together with the Geolocation module.
- Integration with the Address module, providing access to a formatter with a map for address fields. It allows for showing a postal address on the map, without the need to enter geographic coordinates. Automatic conversion of a physical address into coordinates is done using the selected geocoding service (e.g. the paid Google API or the free Nominatim). The Geocoder module offers similar functionality.
- Integration with the Geofield module, adding a formatter with a map to fields with coordinates.
- Integration with the Search API module, adding the ability to limit the search to a given rectangular area.
It is worth mentioning that the architecture of the Geolocation module is open and well-refined, so you can easily add your own functionality based on those listed above.
Modules from the Geofield family
Geofield
An alternative to the Geolocation Field module, forming a somewhat competitive "ecosystem" of geographical modules. Geofield in its basic form is very simple and practical. Beyond the functionalities offered by Geolocation, Geofield also includes:
- Possibility to add the "Find my location" button to a standard widget with latitude and longitude.
- Widget with coordinate fields divided into degrees, minutes, seconds (but it does not support the "Find my location" button).
- Widget supporting the WKT (Well-known text) format describing vector objects on maps. Using WKT you can define an area of any shape instead of a specific point.
- Widget allowing you to define a bounding box – that is an area bounded by a rectangle.
Regarding the integration of Geofield with Views – fields and filters based on the distance to a given point are available, but their configuration options do not include automatic detection of the user's location.
There are not many field display options in the basic Geofield version. You can either choose one of the popular formats (WKT, EWKT, WKB, EWKB, GeoJSON, KML, GPX, GeoRSS, GeoHash) or simply present the coordinates as decimal or sexagesimal numbers. It is not possible to display the position on a map.
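For reference, WKT describes geometries as plain text, with longitude first (WKT uses X Y order). A point and a simple rectangular polygon look like this (the coordinates are illustrative, roughly around Warsaw):

```
POINT(21.0122 52.2297)
POLYGON((20.8 52.1, 21.3 52.1, 21.3 52.4, 20.8 52.4, 20.8 52.1))
```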
It would seem that the Geolocation module is a better choice, but the real strength of Geofield lies in the external add-on modules:
Geofield Map
This module provides integration with the paid Google Maps API. It is very easy to configure, it contains a form widget for selecting locations (with address-based coordinate search), a formatter showing a given place on Google Maps, and a new view display mode that allows displaying results in the form of pins on a map. Advanced users have a lot to choose from, starting with a map layer and ending with their own icons.
The biggest disadvantage of the Geofield Map module is the inevitable cost of using the Google API. This is important for websites with a high number of page views. The creators of the module have implemented three mechanisms that allow for significant savings:
- Support for the free OpenStreetMap service via the Leaflet.js library. This applies only to the edit form widget.
- Support for static maps, i.e. generated by Google in the form of a PNG file. They are not interactive, but they are sufficient for some applications.
- Support for simple Google maps available without limits for free. They are interactive, but their configuration options are very limited.
Leaflet
An alternative to Geofield Map, based on the open Leaflet.js library. By default, it includes support for the free OpenStreetMap service. The most important functionalities of the module are:
- Solid Geofield field formatter that allows you to configure every aspect of the map.
- Views support – map display mode or using the Views GeoJSON module to show structures more complex than individual points.
- Built-in support for the Leaflet.markercluster library grouping pins when zooming a map out.
- Ability to define map layers using a hook.
- Support for form widgets provided by the external Leaflet Widget module. It is a highly advanced functionality, including drawing geometric shapes and placing various types of icons. For basic applications, it is better to use the Leaflet.js widget provided by Geofield Map that has been described earlier.
To sum up – you have a fully free solution here, which should be enough for most needs.
Geocoder
When describing the modules from the Geofield ecosystem, I have not yet mentioned the conversion of a postal address into coordinates. This is the domain of the perfectly designed Geocoder module. It expands widgets, formatters, and views with the possibility of geocoding based on several paid and free providers. The selection of a supplier is carried out separately for every element being configured, which allows for planning the possible costs very accurately.
With the help of Geocoder you can, for example:
- Show coordinates as a postal address (through a separate dedicated formatter).
- Include a location search by address in edit forms.
- Read an address from fields of any type (also those defined by the Address module) and replace it with the position on a map (and vice versa – replace coordinates with an address). Later in the article, I will present a sample configuration for such a solution.
- It is also worth using the Geocoder AJAX Prepopulate module, which allows for finding coordinates based on an address without leaving the editing form. This is because the standard implementation contained in Geocoder performs geocoding only when the form is being saved. If there is a need to make corrections to a map location, you must enter the edit form again.
Simple Google Maps
There is a possibility that you do not need extensive geolocation options at all. The Simple Google Maps module is a significant simplification of what I have described so far. It just provides a map formatter that can be set for any text field. When you enter an address in this field, the end user will see a Google map generated based on that address. It is free and you do not have to obtain an API key. For obvious reasons, it does not offer integration with Views or a widget to mark the position on a map in a form.
Sample places database
You have already learned about the basic modules for geolocation. Now it is time to use them on a specific example. We will create a places database where every entry will have a title, description, address, and coordinates. We will display the list of places in the form of pins on a map. We often encounter this type of solution when creating corporate websites, for example when publishing information about events or facilities of a given company.
We will need to install the following modules for the needs of our places database (I recommend doing it using Composer because external libraries are required):
- Address (https://www.drupal.org/project/address)
- Geofield (https://www.drupal.org/project/geofield)
- Geofield Map (https://www.drupal.org/project/geofield_map)
- Leaflet (https://www.drupal.org/project/leaflet) – including Leaflet Views and Leaflet Markercluster
- Geocoder (https://www.drupal.org/project/geocoder) – including Geocoder Address, Geocoder Field and Geocoder Geofield.
- Geocoder AJAX Prepopulate (https://www.drupal.org/project/geocoder_ajax_prepopulate)
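Assuming a standard Composer-based Drupal 8/9 project, the installation could look roughly like the commands below. The project machine names are taken from the URLs above; the Drush enable list is an assumption based on the submodule names mentioned, so adjust it to the modules you actually need:

```
composer require drupal/address drupal/geofield drupal/geofield_map \
  drupal/leaflet drupal/geocoder drupal/geocoder_ajax_prepopulate
drush en address geofield geofield_map leaflet leaflet_views \
  leaflet_markercluster geocoder geocoder_address geocoder_field \
  geocoder_geofield -y
```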
Editing the places will definitely be the biggest challenge. We want the address to apply to any country. Every place must have coordinates, but they can be filled out based on the address. The address can also be filled out based on the coordinates. Such an approach will greatly simplify the work of content creators. An additional assumption is using only free tools.
In the beginning, we create a new Place content type. We add the following fields to it:
- New Address field named field_address (hide name, surname, and company name in field settings)
- New Geofield field named field_coordinates
Then we determine the relationship between these two fields. We want the address field to be filled out based on the map if the user has not entered the address. And vice versa – for the place on the map to be filled out based on the address after pressing the button. In order to do this:
We enter the field_address field settings and, in the Geocode section, we select the "Reverse Geocode" option (conversion of coordinates into an address). We choose "Coordinates" from the field list and also select "Skip Geocode/Reverse Geocode if the target value is not empty". We choose the free Nominatim from the list of providers.
We enter the field_coordinates field settings and, in the Geocode section, we choose the option "Prepopulate by geocoding an existing field" (conversion of an address into coordinates). We choose "Address" from the field list and also select "Skip Geocode/Reverse Geocode if the target value is not empty". We also choose Nominatim from the list of providers.
When we try to create a new place, we will quickly conclude that typing latitude and longitude is inconvenient. It is much easier to use the map for this purpose. We then go to the form settings for the Place content type and select the Geofield Map widget for the field_coordinates field. A warning about the unconfigured Google API key will appear, but we can ignore it, because we will use the free Leaflet.js library.
The last step is to configure the aforementioned Nominatim service. We go to the address /admin/config/system/geocoder and enter the root URL https://nominatim.openstreetmap.org and the locale pl. That is all!
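Under the hood, forward geocoding against Nominatim is a plain HTTP request to its /search endpoint. The helper below is illustrative, not part of the Geocoder module; it only builds the kind of URL the service expects, using the root URL and locale configured above:

```python
from urllib.parse import urlencode

NOMINATIM_ROOT = "https://nominatim.openstreetmap.org"

def build_geocode_url(address, locale="pl"):
    """Build a Nominatim forward-geocoding URL for a free-text address."""
    query = urlencode({
        "q": address,            # the free-text address to geocode
        "format": "json",        # ask for a JSON response
        "limit": 1,              # best match only
        "accept-language": locale,
    })
    return f"{NOMINATIM_ROOT}/search?{query}"

print(build_geocode_url("Marszalkowska 1, Warszawa"))
```

The JSON response contains `lat` and `lon` fields, which is exactly what Geocoder writes into the coordinates field when the node is saved.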
Now let us try to add a new place. After filling out the address fields, we click the "Populate from the Address field" button, and with any luck, we should get the location of the address on the map. If we do not add the address at all, just a place on the map, the address will be read based on the coordinates.
Display
Now it is time to display the place. We go to the display modes for our Place content type and set a formatter for the field_coordinates field. Now let us check out the page with the functional map created just now:
Views
Displaying a list of places is just as easy. We create a new view in which we show only the nodes with the Place content type. We set the Leaflet Map as the display format.
For the map to work, we need to add the field_coordinates field.
Then, in the Leaflet Map format settings, we select Grouping field No. 1 as "Content Coordinates".
We save the view, add a few places, and go to the /places address. We get an interactive map with marked pins. We can quickly expand it with other functionalities, such as displaying only the places located within 30 km of our location.
Final thoughts
As you can see, the support for geolocation in Drupal requires a lot of knowledge, but it also allows you to create perfectly working solutions for your clients. The possibilities are almost limitless, while the time of implementation is relatively short. In our example, we have barely scratched the surface of the topic of maps and geocoding – it is actually much more extensive. I encourage you to experiment on your own, also trying out the paid components from Google!
To open our short series of articles highlighting ways that the Drupal software and its community are building solutions to help combat the effects of COVID-19, today we hear from Taco Potze as they describe their project at the Biotechnology Innovation Organization.
The BIO Coronavirus Hub was rapidly created and deployed in 48 hours by the internal BIO Digital Team and Open Social. The Hub was developed as a response to the many requests from medical research centers, biopharmaceutical companies, testing developers, and testing sites for various supplies.
Open Social helped launch the new BIO Coronavirus Hub to connect companies and organizations that have relevant supplies, capacities, and resources to share with those companies, researchers, or healthcare providers in need of those items. There are already 1,200+ users with over 400 requests made inside the Hub.
BIO is the global trade association representing the biotechnology industry. Their role in this effort is to serve as a conduit to connect those in need with those who can share, many of whom may be BIO members. We do this through the BIO Coronavirus Hub, which is a public platform open to anyone in need and anyone who can help.
Drupal enables Open Social to publish its modules for people to use and build upon. Our existing 'out of the box' features included many of the functionalities BIO required. The most useful feature for them is the creation of a new supply/demand topic, which, when tagged with "supplies," is moved to a closed group after creation. The members of that group then coordinate and process it.
The Group module in Open Social helped a lot here. We were able to quickly identify the tag and then move it to the group. This module ensured that we could empower BIO to achieve its mission. BIO could focus on configuring and filling their platform.
The speed with which this community was launched is a testament to how flexible Open Social and Drupal are. In this fast-moving environment, time is of the essence. Through this hub, BIO and their partner Healthcare Ready were able to connect medical research centers, biopharmaceutical companies, and testing sites requesting supplies and inquiring about manufacturing capacity.
Open Social continues to work together with BIO to improve the online Hub.
We are all looking for online learning platforms to enhance our skills. We are also sometimes unsure whether investing our time online is worthwhile. Well, that's where open source projects come in.
Open source enables us to innovate and grow with the digital pace.
Companies are not just using open source but also contributing to open source projects to drive growth and revenue. Contributing to an open source project can be a rewarding way to build your experience and skills.
According to Red Hat's State of Enterprise Open Source survey, published in April 2019, 99 percent of IT leaders believe that open source software is important to their enterprise IT strategy.
As outlined in his influential essay on the subject, Why Open Source (which itself draws from David Wheeler's seminal paper, Why Open Source Software), Ben Balter states: "Open source isn't a fad, or a bunch of hippies out in California passing around tie-dye laptops like they would illicit substances. Open source is how modern organisations and increasingly more traditional organisations build software."
Contributing to an open-source project
To see what it means to contribute to an open-source project, let’s take a look at the Firefox project. There are many fun and impactful ways to get involved with Firefox.
You can make both technical and non-technical contributions to Firefox. Technical contributions include coding, documentation, and design, whereas non-technical contributions include helping in the user forums, replying to queries, and reviewing contributions.
The Firefox browser has thousands and thousands of lines of source code, along with information on how to build that code into a working web browser. Anyone who wants to contribute to Firefox can make changes to the source code and then build a customised version of it. They can then send the customised version back to the main project maintainers.
Many developers believe that coding should be done in the open. The code is naturally exposed to everyone, which encourages them to focus on keeping it readable.
Lots of open source contributors start by being users of the software they contribute to. If you find a bug in open source software you use, you will want to have a look at the source to work out whether you can patch it yourself. If that's the case, then contributing the patch back is the best way to make sure that your friends (and you yourself, after you update to the subsequent release) will be able to enjoy it.
Designing
A designer's perspective on a project is completely different from everyone else's. They make sure that everybody working on the project understands users' needs and stays focused on them as the community makes decisions. Open source projects need designers as much as any other contributor.
If your niche is design, you can always contribute to user experience and help the many projects that need it. It is very important for a designer to find a project whose goals they really support and understand.
Writing
Writers are very valuable to the open-source community, because members have to contribute and communicate remotely and, more often than not, in a non-native language.
Documentation, especially on open source projects, is never up-to-date.
Still, it is one of the easiest ways to contribute to any open source project. Documentation is a part of open-source projects that is often overlooked, and its topics are usually approachable for writers. Text assets and translations also present an easy entry point.
For instance, Drupal provides all sorts of writing and editing of documentation including community documentation, help pages within the core Drupal software, blog posts and more.
Contributing to Drupal as a writer will help you gain experience in technical writing and editing.
It's ok to not code
Whatever your reason is to avoid coding, there are always plenty of other contributions you can make!
There's a lot of administrative work too, so coding is optional. Even if you don't want to code, you can help users on forums, reply to issues, translate tutorials, review contributions, and much more – all of which is as important in the open source community as coding. You can always write a blog post or publish your designs, which is as important as any other contribution if you find the right project. You can also get involved in marketing, which is often lacking in open source projects. If you are able to contribute to it, you benefit the whole project and the community behind it.
Then there is the legal side, which mainly concerns large projects. Many lawyers have volunteered their time to open source projects.
You do not have to be worried or scared if you cannot do any of the above-mentioned things; the least you can do is report errors that you find in a project. And remember, every contribution is valuable.
Now that we know that contributing to open source can be a great way to augment your skills, let's talk about the benefits of being an open source contributor.
Benefits
Contributing is fun
It will be a project that you have chosen, so you can enjoy it a bit more. When you contribute to an open source project, you get to know new people who are working on the same project. You can learn from them and exchange ideas with them. You might also get to use technologies that you probably can't in your day job.
You increase your proficiency
It might be challenging at first, but you will slowly be able to identify and contribute to creating working solutions.
Contributing to an open source project can be a great exercise for someone looking for a career change. In an open source project, it doesn't matter if you are a senior developer or a doctor or even a gardener. All that matters is the quality of your contribution. Once you prove yourself in an open source project, you will have gained some valuable experience. You will be an established member of a community with a track record that anyone can check.
Contributing builds reputation
Behind every great open source software, there is a community that aims to make it even more popular. The advantage here is that you can improve your skills and get inspiration and support from like-minded people. If you are a freelancer, you can also increase your chance of being hired by professionals. Contributing to an open source project also increases the visibility of your other channels.
Visitors to the project that you contribute to can learn about your YouTube channel or find your LinkedIn profile and offer you a job.
By contributing to an open source project, you are actually creating a real-life resume that anyone can verify. You don’t have to contact your former employer or client.
You might also get a paid offer if you're really good at what you do.
You find new employees
If you do something creative, publish it as an open-source project, and there might be a chance that someone will want to work for you.
Helping them is helping yourself
Another advantage is that you gain gratitude when you contribute to a project.
The thank-you notes and the reviews you receive are always worth your time and effort. And it feels great when you see your name appearing in a project. And let's not forget: experience is the best CV.
Conclusion
Open source software is free to use, and it can be distributed and modified by anyone.
You can always get out of your comfort zone and experiment with your skills. If you are a doctor and want to code, do it. If you’re a coder and you want to design, the open source world is your canvas!
And if you value the idea of contributing to open source, you deserve the recognition that comes with being a respected member of the community.
To learn about contributing to Drupal, one of the largest and most popular open-source projects, read the follow-up article, which covers all the perks of being a Drupal contributor. Ping us at email@example.com to find out how you can be a part of a growing open source community.