Appnovation Technologies: Simple Website Approach Using a Headless CMS: Part 1

Planet Drupal - Wed, 2019/02/06 - 9:00am
I strongly believe that the path to innovation requires a mix of experimentation, sweat, and failure. Without experimenting with new solutions, new technologies, new tools, we are limiting our ability to improve, arresting our potential to be better, to be faster, and sadly ensuring that we stay rooted in systems, processes and...

Yuriy Gerasimov: Automating checking your Drupal's updates

Planet Drupal - 15 hours 42 min ago

Drupal updates can be very different. Some are easy patches that you just roll out and forget; others break your site. The tricky part is that you never know how updates will behave on your site until you actually try them out.

This is why it is very hard to give clients estimates of how long updates will take. They usually do not appreciate an answer like "1 to 20 hours, depending on factors we cannot predict."

As a result, rolling out updates gets delayed again and again. Then, after half a year or a year, we reach the point where we know for sure the site will break after updating. And now hero time begins.

Wouldn't it be nice if the site told you not only that it needs updates, but also whether it is going to break after rolling them out?

Nowadays, thanks to Pantheon's multidev, it is technically possible to automate checking how your updates will behave on the site.

The main idea is to check for updates regularly (using a drush command) and, if updates are found, create a separate environment and roll them out there. Afterward, to ensure they didn't break the site (at least visually), we can run visual regression testing. As a result, we get a much more predictable answer to "how much effort will it take to roll these updates out?"
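A minimal sketch of that loop, assuming Pantheon's Terminus CLI and Drush 8; the site name, environment names and the emptiness check are placeholders to illustrate the idea, not a definitive implementation:

# Check for available module updates on the dev environment.
terminus drush my-site.dev -- pm-updatestatus --format=json > updates.json

# If anything is outdated, create a throwaway multidev and apply updates there.
if [ -s updates.json ] && [ "$(cat updates.json)" != "[]" ]; then
  terminus multidev:create my-site.dev updates
  terminus drush my-site.updates -- pm-update --yes
  # Point a visual regression tool (e.g. Diffy) at dev vs. the multidev here.
fi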

Here is a full tutorial on how to set it up: http://docs.diffy.website/tutorials/put-your-sites-updates-on-autopilot-with-pantheon-multidev-and-visual-testing.

Fixing small updates regularly is certainly much easier than fixing a big breakage after a year of delays.

Tags: drupal planetdrupal 8

Dries Buytaert: Optimizing your product strategy for the short- and long-term

Planet Drupal - Sun, 2018/12/16 - 10:49pm

Most products cycle through the infamous Innovation S-curve, which maps a product's value and growth over time.

Startups are eager to find product-market fit, the inflection point in which the product takes off and experiences hockey-stick growth (the transition from phase one to phase two).

Just as important, however, is the stagnation point, or the point later in the S-curve when a product experiences growth stagnation (the transition from phase two to phase three). Many startups don't think about their stagnation point, but I believe they should, because it determines how big the product can become.

Ten years ago, a couple years after Acquia's founding, large organizations were struggling with scaling Drupal. I was absolutely convinced that Drupal could scale, but I also recognized that too few people knew how to scale Drupal successfully.

Furthermore, there was a lot of skepticism around Open Source scalability and security. People questioned whether a community of volunteers could create software as secure and scalable as their proprietary counterparts.

These struggles and concerns were holding back Drupal. To solve both problems, we built and launched Acquia Cloud, a platform to build, host, and manage Drupal sites.

After we launched Acquia Cloud, Acquia grew from $1.4 million in bookings in 2009 to $8.7 million in bookings in 2010 (600% year-over-year growth), and to $22 million in bookings by 2011 (250% year-over-year growth). We had clearly found product-market fit!

Not only did it launch Acquia into rocket-ship growth, it also extended our stagnation point. We on-boarded many large organizations and showed that Drupal can scale very large. This helped unlock a lot of growth for both Drupal and Acquia. I can say with certainty that many large organizations that now use Drupal would not have adopted it without Acquia.

Helping to grow Drupal — or extending Drupal's stagnation point — was always part of Acquia's mission. From day one, we understood that for Acquia to grow, Drupal had to grow.

Launching Acquia Cloud was a great business decision for Acquia; it gave us product-market fit and launched us into hockey-stick growth, but it also extended our S-curve.

As I think about how Acquia has approached the Innovation S-curve, a few important lessons stand out. My recommendation is to focus on opportunities that accomplish two things:

  • Focus on business opportunities that serve a burning customer need that can launch or accelerate your organization.
  • Focus on business opportunities that remove long-term barriers to growth and push out the stagnation point.

OpenSense Labs: Cooking Drupal-based platform for food lovers

Planet Drupal - Sun, 2018/12/16 - 7:20pm

Cooking something delicious requires the right ingredients and preparation. Serving delectable food can entice people and make them remember the taste for a long, long time. Building a food-based website likewise requires the right platform, one that can engage people and offer ambitious digital experiences. This is where Drupal enters the scene.


As an open source CMS, Drupal is a reliable, secure and flexible platform that helps create the features technology professionals want. It conforms to technical and business requirements and can be a great solution for powering the online presence of a food-based brand.

Drupal’s Greatness for Food-based Website

Drupal powers the websites of some of the great names in the food industry, like BBC Good Food, 24 Kitchen, Bosscaffe and Alevri, among others. What prompts them to choose Drupal?

Spectacular functionalities

Drupal offers a lot of modules, themes, and distributions that can help build a great food-based website.

For instance, a restaurant chain can make its Drupal website stylish with Restaurant Lite, a free mobile-first, Bootstrap 3 based theme for Drupal.

You can also leverage the Restaurant Zymphonies Theme, another free mobile-first theme that can power the websites of restaurants, food stalls, hotels and other dining businesses.

Food Delivery, a Drupal distribution built as a multifunctional commerce profile for complex online stores, offers a complete starter pack for developing food-based websites.

Security

The Drupal Security Team works round the clock validating and responding to security issues, making Drupal one of the most secure open source CMSs. Among the leading open source CMSs, it performs far better than WordPress, Joomla and Magento.

Source: Alert Logic

Scalability

To help you cope with the busiest of days, Drupal can scale with your needs and can efficaciously handle a massive amount of traffic.

Multilingual

Drupal has out-of-the-box support for building multilingual websites in the form of core modules that allow you to deliver localised digital experiences.

Mobile-responsive

Drupal helps in building responsive sites and web applications, allowing you to interact with your consumers on the go.


Speed

Drupal's flexible platform is superb for running an agile team and ensuring continuous delivery of a web development project. Furthermore, it helps in implementing performance optimisation techniques for building a high-performing site.

Third-party integration

You can leverage the best tools outside the periphery of Drupal by integrating a variety of marketing technologies and business applications.

Content Workflow

Content authors can get awesome tools for creating and publishing content on the site. With the help of its preview feature, you can see how your content will look across various devices. Moreover, the authentication and permissions enhance the editorial workflow.

Content architecture

Drupal lets you create the right content architecture and show appropriate content for each context with the help of great display mode tools, Views and a plethora of media types.


Content-as-a-service

Its content-as-a-service approach lets front-end developers build engaging customer experiences with Drupal's presentation-neutral content and RESTful API, leveraging tools like Angular, Ember, Backbone and many more.

Multisite

Drupal's immense capability for building multisite architectures allows you to govern multiple websites across your enterprise's brands, geographies and promotional campaigns from a centralised platform.

Business-driven

You can build a solution that adheres to your business requirements and does things as your business demands.

Perfect tech stack

Drupal runs on a modern LAMP technology stack, comprising Linux, Apache, MySQL and PHP, which helps fulfil the needs of fast-moving, flexible and agile enterprises in creating ambitious digital experiences.

Large community

Drupal's vast community presence is one of its greatest strengths, as thousands of organisations create solutions with Drupal and, in the process, build Drupal itself.


Case study

24 Kitchen, one of the most popular platforms for food, cooking and lifestyle, harnessed the power of Drupal, with the help of a digital agency, to transform its presence in the digital landscape.

Search API Solr Search, a Drupal module providing a Solr backend, uses Apache Solr servers for indexing and searching content; it was used to power related recipes on the 24 Kitchen website. The Fast Autocomplete module merged the extensive video database with the recipe database, allowing the retrieval of thousands of recipes with recommendations based on preferences and user parameters.

The content editors could create flexible layout components, new pages and formats without any developer's intervention. This was made possible by the Paragraphs module. High-value content was exposed through smart widgets, which allowed other websites to directly query and present content from 24 Kitchen within their own sites.

The Drupal website of 24 Kitchen has been very advantageous for marketing and sales thanks to its seamless integration with the 24 Kitchen television format. It has paved the way for working with partners on promotional campaigns and cross-selling via numerous national news and lifestyle sites. The Drupal-powered website made content syndication possible, serving content to consumers when relevant (for example, the widgets and the Facebook recipe bot). Ultimately, rebuilding the 24 Kitchen website on Drupal's monumental capabilities resulted in staggering growth in page views.

Conclusion

Drupal offers a stupendous platform for building a powerful food-based website and allows enterprises to grow as brands.

We love Drupal and are in constant pursuit of delivering high-quality, innovative solutions with our expertise in Drupal development.

Contact us at hello@opensenselabs.com to build a food-based site using Drupal.


OpenSense Labs: Conversational Commerce: From a Whisper to a Roar

Planet Drupal - Sun, 2018/12/16 - 6:37pm

We are continuing our march towards an AI-first world, where mobile centricity is being replaced by personalised user-centricity. This is an age where conversational commerce and artificial intelligence (AI) as a utility are altering the way we communicate with brands and with each other. A great example is LivePerson's LiveEngage. As a leading conversational commerce platform, it allows the students of Barry University to interact over popular messaging services: they can ask anything, from application processes to the courses available, as conveniently as they chat with friends and family.


Conversational commerce is already making great inroads and is revolutionising the way consumers and brands interact. It has the potential to be a curator of services and experiences that intelligently fulfils consumers' needs and engages them emotionally, anytime and anywhere. Drupal Commerce, an open source e-commerce solution, can help in implementing conversational commerce. Before we look at Drupal Commerce's capabilities for this, let's explore conversational commerce itself.

Conversational Commerce: Explained

Source: Mücke, Sturm & Company GmbH

From traditional point of sale (POS) systems to mobile commerce, we have seen it all. They have paved the way for conversational commerce, which can dramatically transform our shopping experience by ushering in a new era of individualised shopping.

Chris Messina coined the term "conversational commerce" after observing the dominant trend in consumer computing apps; the concept came to life in 2015 with Uber's integration into Facebook Messenger. He described it as a solution that can deliver "convenience, personalization, and decision support while people are on the go with only partial attention to spare".

Conversational commerce is about delivering convenience, personalization, and decision support while people are on the go, with only partial attention to spare. - Chris Messina

Conversational interfaces leverage the power of AI while using the technologies consumers already relish, like messaging, voice or other natural language interfaces, thereby enabling people to interact with brands or services through bots.

With the objective of delivering a top-of-the-line customer experience, it helps replicate the one-on-one, in-store salesperson experience. Whether you are buying clothes, ordering food, or making a financial transaction, conversational commerce is here to revolutionise the business model.

Why is Conversational Commerce Important?

Comscore states that 85% of smartphone time is spent on social media, messaging and other media applications. Moreover, the number of digital voice assistant users is projected to reach 1.8 billion by 2021.

The massive adoption of messaging applications, along with the rise of conversational, AI-powered technology, can streamline one-on-one conversations at scale. And as people move away from clicks and taps towards conversational UIs, more adoption of conversational commerce can be expected in the coming years, as it promises a slew of benefits:

  • Driving customer satisfaction: Offering voice assistants can enhance brands' Net Promoter Scores (NPS®), thereby driving customer satisfaction.

  • Generating more business: Enterprises will be able to generate more business and positive word-of-mouth.

  • Increasing consumer spending: Consumers are willing to increase their spending with a brand when they receive a good voice assistant experience.

  • Focussing on priority consumer segments: Enterprises that emphasise the segmentation of priority consumers, such as targeting users who prefer voice assistants for purchases, will reap great value.

How does Conversational Commerce Work?

There are different ways to look at conversational commerce in action. One type of experience is the proactive model: a customer buys something for the very first time from a shop and receives an automated message greeting and thanking them for shopping at that store. Another working model is the reactive approach: a customer buys a defective product from a shop and seeks assistance, and in response the shop guides them through a solution.

Source: Chatbots Magazine

Furthermore, there are one-to-one and one-to-many working models. If a shop sends a message and a personalised offer to a customer who has applauded its service on Twitter, that is the one-to-one approach. Suppose a restaurant chain has opened a new store: it can send a custom message to a targeted audience segment living nearby. That is the one-to-many approach.

Also, digital voice assistants like Amazon Alexa and Google Home can assist consumers in finding stores, hotels, events and much more. They can also recommend products, access customer care services, book a cab, or even control smart home devices.

Conversational Commerce with Drupal

Decoupled Drupal Days 2018 had a session that showed how to integrate bots with Drupal Commerce. When it comes to a bot's business logic, Drupal Commerce turned out to be a great solution, in addition to its robust capabilities for storing content and product details. Moreover, Drupal, being an API-first CMS, offers a plethora of APIs out of the box, and the Commerce Cart API module makes it easy to interact with carts in Drupal Commerce.


It showed how decoupled Drupal Commerce, bot frameworks and Natural Language Understanding (NLU) can be leveraged to provide a boundless shopping experience on e-commerce websites built on Drupal, using the Drupal services layer.


The session focussed on Drupal 8 core services and the creation of custom REST APIs. It also laid emphasis on utilising decoupled Drupal Commerce APIs and on using Node.js and a bot framework to build a chatbot.

In addition, it talked about the bot frameworks and other cognitive services that can be used to develop bots for different use cases. Several bot frameworks were considered, such as Facebook Messenger (wit.ai), Google Dialogflow, IBM Watson, the Microsoft Bot Framework and open source conversational AI like Rasa. Ultimately, it all started with the open source Bot Builder SDK for Node.js, which is used for building bots. The application was hosted on Heroku, and the Microsoft Bot Framework was used to integrate with Facebook.

In the demonstration, some common e-commerce functionalities, like searching and exploring products, were exposed as REST APIs. The main idea was that bots would let users search and explore products by calling Drupal Commerce APIs. Through message-based interaction, bots can also provide simple Add to Cart and Review Cart functionality, among others, and can offer relevant actions while a user looks for a product.
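As a rough sketch of that flow (not code from the session), a PHP bot backend might call such endpoints like this. The product search path, payload shape and field names are assumptions for illustration; the Commerce Cart API module does expose cart endpoints, but check its documentation for the exact contract:

<?php

use GuzzleHttp\Client;

// Hypothetical handler the bot framework invokes when a user asks to buy something.
$client = new Client(['base_uri' => 'https://shop.example.com']);

// Search products through a custom REST endpoint (path is an assumption).
$response = $client->get('/api/products', [
  'query' => ['keywords' => 'red shoes', '_format' => 'json'],
]);
$products = json_decode((string) $response->getBody(), TRUE);

// Add the first match to the cart (payload shape is an assumption).
$client->post('/cart/add', [
  'query' => ['_format' => 'json'],
  'json' => [
    [
      'purchased_entity_type' => 'commerce_product_variation',
      'purchased_entity_id' => $products[0]['variation_id'],
      'quantity' => 1,
    ],
  ],
]);

// Read the cart back so the bot can summarise it for the user.
$cart = json_decode((string) $client->get('/cart', [
  'query' => ['_format' => 'json'],
])->getBody(), TRUE);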

Conclusion

Conversational commerce represents an astronomical opportunity for brands and retailers alike. It can enable them to interact with their consumers in a new and innovative way. Enterprises must harness this interaction opportunity to build relationships of value with consumers across the lifecycle.

Conversational commerce, along with Drupal Commerce, can help organisations offer a completely new and more intuitive way for consumers to engage with them.

We have been steadfast in our goal of powering digital innovation for enterprises with our expertise in Drupal development.

Contact us at hello@opensenselabs.com to implement conversational commerce in your business.


OpenSense Labs: Creating a Custom REST Resource for GET Method in Drupal 8

Planet Drupal - Sun, 2018/12/16 - 9:04am

As the world gets more connected, web technologies, too, need to get connected. RESTful web services can be used as an application programming interface to connect various services.

This is made possible by using HTTP requests to GET, PUT, POST and DELETE data. It is based on representational state transfer (REST), an architectural style and approach to communications often used in web services development.

With decoupled development gaining ground, it has become important for developers to understand REST better. Drupal gives its developers an in-built way to use it: the RESTful Web Services module, now part of Drupal core, provides REST services.

It allows users to read or update data on the site. 

Different verbs or request methods that are used for the approach are:

GET, HEAD, POST, PUT, DELETE, TRACE, OPTIONS, CONNECT and PATCH

The GET method allows the user to access various entities like nodes, users, taxonomy terms, comments and even watchdog database log entries. GET is a safe method, as no database manipulation takes place; it is a read-only request.

Unlike GET, in the POST method, database manipulation takes place. The user can update the database and create a node if need be. 

Creating REST Resource Plugins

REST resource plugins are important for exposing additional resources over REST. In the steps below we will create a basic example to illustrate the fundamentals: a REST resource endpoint that takes a content type as an argument and returns the titles and node IDs of all nodes of that content type.

  • Create the scaffolding code for the REST resource with the following Drupal Console command:
drupal generate:plugin:rest:resource

You can also create the scaffolding manually by adding a plugin file in your custom module at src/Plugin/rest/resource/{ResourceClassName}.php.

Next, define the namespace, dependencies, and the plugin class as given in the example below.

  • Next, you need to create the @RestResource annotation. It indicates the dependencies and status of the class. Another important point to consider is that the URL mapping is case-sensitive.

    We must also be careful with the @RestResource annotation's uri_paths, which take link relation types as keys and partial URIs as values. If you don't specify any, Drupal will automatically generate URI paths (and hence URLs) based on the plugin ID.

    If your plugin ID is rest_example, you'll end up with /rest_example/{id} for GET|PATCH|DELETE and /rest_example for POST. But often you'll want to specify your own paths, such as a canonical URI path (for example /todo/{todo_id}). For example:

 * uri_paths = {
 *   "canonical" = "/rest_example/{id}",
 * }

Here's the complete REST resource for a GET request:

<?php

namespace Drupal\rest_example\Plugin\rest\resource;

use Drupal\node\Entity\Node;
use Drupal\rest\Plugin\ResourceBase;
use Drupal\rest\ResourceResponse;
use Symfony\Component\HttpKernel\Exception\AccessDeniedHttpException;

/**
 * Provides a resource to get view modes by entity and bundle.
 *
 * @RestResource(
 *   id = "rest_example",
 *   label = @Translation("Rest example"),
 *   uri_paths = {
 *     "canonical" = "/rest/api/get/node/{type}"
 *   }
 * )
 */
class RestExample extends ResourceBase {

  /**
   * Responds to GET requests.
   *
   * @return \Drupal\rest\ResourceResponse
   *   The HTTP response object.
   *
   * @throws \Symfony\Component\HttpKernel\Exception\HttpException
   *   Thrown when access is denied.
   */
  public function get($type = NULL) {
    // Implement the logic of your REST resource here.
    // Use the current user (after authentication) to validate access.
    if (!\Drupal::currentUser()->hasPermission('access content')) {
      throw new AccessDeniedHttpException();
    }
    $data = [];
    if ($type) {
      $nids = \Drupal::entityQuery('node')->condition('type', $type)->execute();
      if ($nids) {
        $nodes = Node::loadMultiple($nids);
        foreach ($nodes as $node) {
          $data[] = ['id' => $node->id(), 'title' => $node->getTitle()];
        }
      }
    }
    $response = new ResourceResponse($data);
    // To return fresh results on every request (without clearing the cache),
    // add the data as a cacheable dependency so the cache is invalidated.
    $response->addCacheableDependency($data);
    return $response;
  }

}

Output 

[   {     "id": "67",     "title": "Consectetuer Sits"   },   {     "id": "69",     "title": "Eum Paulatim"   },   {     "id": "70",     "title": "Tation"   } ]
Configuring the REST Resources

In the config file, we define which HTTP methods, serialization formats and authentication mechanisms are supported by the plugin.

This config file is necessary for the REST resource plugin to work. There are two ways to configure a @RestResource plugin:

  1. REST UI module
    • The REST UI module is not part of core, so you have to download and install it like any other module on your Drupal site.
    • Once installed, go to /admin/config/services/rest and enable your REST resource.
    • Select your desired settings. To have the hal_json format, you can enable Drupal core's HAL module.
  2. Manually
    • Go to the config directory and create a config file named rest.resource.{resource id}.yml.
    • Add the following content:

langcode: en
status: true
dependencies:
  module:
    - rest_example
    - serialization
    - user
id: rest_example
plugin_id: rest_example
granularity: resource
configuration:
  methods:
    - GET
  formats:
    - json
  authentication:
    - cookie

    • Save this file as rest.resource.{plugin_id}.yml (rest.resource.rest_example.yml in this case) in your config directory and run a config sync.

Defining your resource config:

  • status: true or false (enabled or disabled)
  • id: a unique ID for the resource
  • plugin_id: the plugin's unique ID
  • configuration => methods: all the request methods you want this plugin to support
  • configuration => formats: all the supported serialization formats
  • configuration => authentication: all the supported authentication methods

By default, the REST module supports json and xml.

Key point: each REST resource must be annotated with the @RestResource annotation so that it can be discovered by the RESTful Web Services module.
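With the resource enabled using the settings above (GET, json, cookie authentication) and the 'access content' permission granted, you can try the endpoint with a request like the following. The domain and content type are placeholders; note that Drupal 8's REST module expects the _format query parameter:

curl 'https://example.com/rest/api/get/node/article?_format=json'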

A custom REST API is useful for sending data to third-party applications and for headless architectures. As of today, there is hardly any project or application that doesn't have a REST API. Connect with us at hello@opensenselabs.com for help creating yours.


Red Route: A framework for progressively decoupled Drupal: Introducing the SPALP module

Planet Drupal - Sat, 2018/12/15 - 7:42am

This article was originally posted on the Capgemini Engineering blog

A lot of people have been jumping on the headless CMS bandwagon over the past few years, but I’ve never been entirely convinced. Maybe it’s partly because I don’t want to give up on the sunk costs of what I’ve learned about Drupal theming, and partly because I'm proud to be a boring developer, but I haven’t been fully sold on the benefits of decoupling.

On our current project, we’ve continued to take an approach that Dries Buytaert has described as "progressively decoupled Drupal". Drupal handles routing, navigation, access control, and page rendering, while rich interactive functionality is provided by a JavaScript application sitting on top of the Drupal page. In the past, we’d taken a similar approach, with AngularJS applications on top of Drupal 6 or 7, getting their configuration from Drupal.settings, and for this project we decided to use React on top of Drupal 8.

There are a lot of advantages to this approach, in my view. There are several discrete interactive applications on the site, but the bulk of the site is static content, so it definitely makes sense for that content to be rendered by the server rather than constructed in the browser. This brings a lot of value in terms of accessibility, search engine optimisation, and performance.

A decoupled system is almost inevitably more complex, with more potential points of failure.

The application can be developed independently of the CMS, so specialist JavaScript developers can work without needing to worry about having a local Drupal build process.

If at some later date, the client decides to move away from Drupal, or at the point where we upgrade to Drupal 9, the applications aren’t so tightly coupled, so the effort of moving them should be smaller.

Having made the decision to use this architecture, we wanted a consistent framework for managing application configuration, to make sure we wouldn't need to keep reinventing the wheel for every application, and to keep things easy for the content team to manage.

The client’s content team want to be able to control all of the text within the application (across multiple languages), and be able to preview changes before putting them live.

There didn’t seem to be an established approach for this, so we’ve built a module for it.

As we've previously mentioned, the team at Capgemini are strongly committed to supporting the open source communities whose work we depend on, and we try to contribute back whenever we can, whether that’s patches to fix bugs and add new features, or creating new modules to fill gaps where nothing appropriate already exists. For instance, a recent client requirement to promote their native applications led us to build the App Banners module.

Aiming to make our modules open source wherever possible helps us to think in systems, considering the specific requirements of this client as an example of a range of other potential use cases. This helps to future-proof our code, because it’s more likely that evolving requirements can be met by a configuration change, rather than needing a code change.

So, guided by these principles, I'm very pleased to announce the Single Page Application Landing Page module for Drupal 8, or to use the terrible acronym that it has unfortunately but inevitably acquired, SPALP.

On its own, the module doesn’t do much other than provide an App Landing Page content type. Each application needs its own module to declare a dependency on SPALP, define a library, and include its configuration as JSON (with associated schema). When a module which does that is installed, SPALP takes care of creating a landing page node for it, and importing the initial configuration onto the node. When that node is viewed, SPALP adds the library, and a link to an endpoint serving the JSON configuration.
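As a rough illustration of that wiring, here is a minimal sketch of the two YAML files such an application module might contain. The module name, library name and file paths are hypothetical; the SPALP repository's example module shows the canonical layout, and the JSON configuration and schema files are not shown here:

# my_app.info.yml: a hypothetical application module depending on SPALP.
name: My App
type: module
core: 8.x
dependencies:
  - spalp:spalp

# my_app.libraries.yml: the JavaScript bundle SPALP attaches to the landing page node.
my_app:
  js:
    js/my_app.bundle.js: {}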

Deciding how to store the app configuration and make all the text editable was one of the main questions, and we ended up answering it in a slightly "un-Drupally" way.

On our old Drupal 6 projects, the text was stored in a separate 'Messages' node type. This was a bit unwieldy, and it was always quite tricky to figure out what was the right node to edit.

For our Drupal 7 projects, we used the translation interface, even on a monolingual site, where we translated from English to British English. It seemed like a great idea to the development team, but the content editors always found it unintuitive, struggling to find the right string to edit, especially for common strings like button labels. It also didn't allow the content team to preview changes to the app text.

We wanted to maintain everything related to the application in one place, in order to keep things simpler for developers and content editors. This, along with the need to manage revisions of the app configuration, led us down the route of using a single node to manage each application.

This approach makes it easy to integrate the applications with any of the good stuff that Drupal provides, whether that’s managing meta tags, translation, revisions, or something else that we haven't thought of.

The SPALP module also provides event dispatchers to allow configuration to be altered. For instance, we set different API endpoints in test environments.

Another nice feature is that in the node edit form, the JSON object is converted into a usable set of form fields using the JSON forms library. This generic approach means that we don’t need to spend time copying boilerplate Form API code to build configuration forms when we build a new application - instead the developers working on the JavaScript code write their configuration as JSON in a way that makes sense for their application, and generate a schema from that. When new configuration items need to be added, we only need to update the JSON and the schema.

Each application only needs a very simple Drupal module to define its library, so we’re able to build the React code independently, and bring it into Drupal as a Composer dependency.

The repository includes a small example module to show how to implement these patterns, and hopefully other teams will be able to use it on other projects.

As with any project, it's not complete. So far we've only built one application following this approach, and it seems to be working pretty well. Among the items in the issue queue is better integration with the configuration management system, so that we can make it clear if a setting has been overridden for the current environment.

I hope that this module will be useful for other teams - if you're building JavaScript applications that work with Drupal, please try it out, and if you use it on your project, I'd love to hear about it. Also, if you spot any problems, or have any ideas for improvements, please get in touch via the issue queue.

Tags: Capgemini, development, Drupal, JavaScript, open source

Agiledrop.com Blog: The Story of Agiledrop: Introduction

Planet Drupal - Fri, 2018/12/14 - 2:50pm

We've started a series of blog posts that tell the story of what makes our developers successful when working with other Drupal teams. The first chapter introduces our workflow and the advantages it brings.

READ MORE

OpenSense Labs: SCORM and E-Learning. Can Drupal Fit In?

Planet Drupal - Fri, 2018/12/14 - 1:02pm

Referred to as the de facto standard of e-learning, the Shareable Content Object Reference Model (SCORM) was sponsored by the US Department of Defense to bring uniformity to the standards for procuring both training content and Learning Management Systems (LMSs).

Long gone but not forgotten are those days when learning was only limited to books and classrooms. With the development of technology, virtual learning has transformed into an approachable and convenient method.

Can Drupal, which is a widely popular CMS for education websites, conform to SCORM standards? How does it ensure that it remains SCORM compliant? 


In Details - What is SCORM?

SCORM is a set of standard guidelines and specifications for programmers on how to create LMSs and training content that can be shared across systems.

The aim of SCORM was to create standard units of training and educational material that can be shared and reused across systems.


Shareable Content Object refers to creating units of online training material that can be shared and reused across systems and contexts.

Reference Model refers to the existing standards in the education industry while informing developers on how to properly use them together.

Working with the authoring tools to design and produce the content, e-learning professionals, training managers, and instructional designers are the ones who typically use SCORM packages.

Content (used in courses and LMSs) is exported as a SCORM package (a .zip folder) to give learners a seamless and smooth upload experience.

The Evolution of SCORM

Since SCORM wasn't built as a standard from the ground up and was primarily a reference to existing ones, the goal was to create an interoperable system that would work well with other systems.

To date, there are three released versions of SCORM, each built on top of the previous one and solving its predecessor's problems.

  • SCORM 1.0

SCORM 1.0 was merely a draft outline of the framework. It did not include any fully implementable specifications, but rather contained a preview of the work yet to come.

SCORM 1.0 included the core elements that would become the foundation of SCORM.  

In other words, this version specified how content should be packaged, how it should communicate with systems, and how it should be described.

  • SCORM 1.1

SCORM 1.1 was the first implementable version of SCORM. It marked the end of the trial implementation phase and the beginning of the application phase for ADL. 

  • SCORM 1.2

SCORM 1.2 solved the many problems that came along with version 1.1. Providing robust, implementable specifications, this version presented its end users with drastic cost savings.

It was, and still remains, one of the most widely used versions.

  • SCORM 2004 (1st - 4th edition)

The 2004 1st edition allowed content vendors to create navigation rules between SCOs. The 2nd edition covered the various shortcomings of the 1st. It brought with it Advanced Distributed Learning (ADL), which focused on developing and assessing distributed learning prototypes, enabling more effective, efficient and affordable learner-centric solutions.

The 3rd edition removed any ambiguity, improving the sequencing specifications for greater interoperability.

The final, 4th edition focused on disambiguation and the addition of new sequencing specifications. These specifications widened the options available to content authors, making the creation of sequenced content even simpler.
 


Why Should You Use SCORM?

Now that we have an idea of SCORM and its attempt to reduce chaos in the industry, let's look at the benefits it brings.

Here are some of the main reasons to use SCORM.

  • It is a pro-consumer initiative. Online courses can be used on any compliant LMS; as long as you have the zip package, you can upload a course to any of them.
  • All the high-quality LMSs and authoring tools are SCORM compliant, forming a great ecosystem of interoperability and reliability.
  • The introduction and evolution of SCORM have greatly reduced the overall cost of delivering training, because there is no additional cost for integrating any type of content.
  • SCORM helps standardize e-learning specifications, providing a set of technical specifications that gives developers a standard blueprint to work with.
How does SCORM Work?

Besides guiding programmers, SCORM governs two main things: packaging content and exchanging data at runtime.

  • Packaging content, or the content aggregation model (CAM), defines how a piece of content should be presented in a physical sense. The LMS needs it to export, import and launch content without any human intervention.
  • Runtime communication, or data exchange, defines how the content works with the LMS while it is actually being played. This is the part that describes delivery and tracking of the content; in practice it covers things like "request the learner's name" or "tell the LMS that the learner scored 95% in a test". A sketch of this runtime conversation follows.
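For illustration, here is a minimal sketch of that conversation using the SCORM 1.2 JavaScript runtime API, assuming the LMS exposes the API object on a parent window (real content walks up the frame hierarchy to find it):

// Locate the SCORM 1.2 API object provided by the LMS (simplified).
var API = window.parent.API;

API.LMSInitialize("");                               // start the session
var name = API.LMSGetValue("cmi.core.student_name"); // "request the learner's name"
API.LMSSetValue("cmi.core.score.raw", "95");         // "the learner scored 95% in a test"
API.LMSSetValue("cmi.core.lesson_status", "passed");
API.LMSCommit("");                                   // persist the data
API.LMSFinish("");                                   // end the session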
Working of SCORM Packages

SCORM recommends that content be delivered in a self-contained directory or a ZIP file. These files contain the content, defined by the SCORM standards, and are called Package Interchange Files (PIFs), or in other words SCORM packages.

A package contains all the files that need to be delivered in the content package via the SCORM runtime environment.

The course manifest file is considered the heart of the SCORM content packaging system. The manifest is an XML file that describes the content.

Some of the pieces involved in the packaging are:

  • Resources

Resources are the list of parts that bundle up into a single course. There are two types of resources that contribute to a course.

The first is a collection of one or more files that make up a logical unit presented to the user. The other is an SCO (Sharable Content Object), a unit of instruction composed of one or more files that communicates with the LMS. It mostly contains the instructional or static part of the content presented to the user via the course.

A resource should contain a complete list of all the files it requires to function properly, so that the list can be ported to a new environment and work the same way.

 

  • Organizations

Organizations are a logical grouping of resource parts into a hierarchical arrangement. This is what is delivered to a particular learner when an item is selected.

  • Metadata

Metadata is used to describe the elements of a content package in its manifest file. It is important because it facilitates the discovery of learning resources across content packages or in a repository.

When a learning resource is intended to be reusable, it is best practice to describe it with metadata. The Learning Object Metadata standard contains many predefined fields for describing learning content.
  • Sequencing

Sequencing is responsible for determining what happens next when a learner exits an SCO. With navigational control, it orchestrates the flow and status of the course as a whole. However, it doesn't affect how SCOs operate and navigate internally; that is defined by the content developer. A simplified manifest showing how these pieces fit together follows.
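To make these pieces concrete, here is a heavily simplified sketch of what a SCORM 1.2 imsmanifest.xml might look like. The identifiers and file names are invented, and a real manifest also declares the metadata block and the schema and namespace locations omitted here:

<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="com.example.course" version="1.2">
  <!-- Organizations: the hierarchical arrangement delivered to the learner. -->
  <organizations default="org1">
    <organization identifier="org1">
      <title>Example Course</title>
      <item identifier="item1" identifierref="sco1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <!-- Resources: the files that make up each SCO. -->
  <resources>
    <resource identifier="sco1" type="webcontent" adlcp:scormtype="sco"
              href="lesson1/index.html">
      <file href="lesson1/index.html"/>
    </resource>
  </resources>
</manifest>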

Drupal With SCORM 

Drupal is best at managing digital content, but the tasks of planning, implementing and assessing a specific learning process are best done by an LMS.

How, then, can Drupal become a platform for an organization to deliver effective training, manage learners and their individual progress, and record results?

Since Drupal is not an LMS, its distributions and modules help fill the gap. When it comes to SCORM compliance, Drupal has the Opigno LMS distribution.

Opigno LMS is a Drupal distribution that integrates H5P technology (an open source content collaboration framework based on JavaScript), which enables you to create rich interactive training content. It allows you to maintain training paths organized in courses and lessons.

This distribution includes the latest version of Opigno core that offers you effective and innovative online training tools.

Opigno LMS is fully compliant with SCORM (1.2 and 2004 v3) and offers a powerful editor for content management, in particular for creating course material. Courses can be grouped into classes to provide easy, manageable training paths. It should also be noted that this distribution is the quickest way to stand up a functional e-learning platform out of the box, with users, courses, certificates, etc.

Based on this distribution, the Opigno SCORM module implements the SCORM feature in Opigno. It allows you to load and play SCORM packages within Opigno training and handles the training paths organized in courses and lessons.

Opigno LMS comprises an app store that also enables you to install the latest features easily, without requiring you to upgrade the current install.

Measured against the requirements and expectations of learners, Opigno LMS can be summarized by the following specifications:

  1. Scalable, to handle the demands of a dynamic, changing environment
  2. Safe and easy to update
  3. Supports further development of customized functionalities, properly integrated with the core solution in a modular way
  4. Open, leaving each client free and independent
  5. And most importantly, easy to integrate with other enterprise systems

The H5P JavaScript framework makes it easy to create, share and reuse HTML5 content and applications, allowing users to build richer content. With H5P, authors can edit and construct videos, presentations, games, advertisements, etc. To create an e-learning platform, the integration of the H5P framework with SCORM is essential.


The H5P SCORM/xAPI module allows you to upload and view SCORM and xAPI packages. It uses two H5P libraries (H5P libraries are used to create and share rich content and applications):

  1. The H5P SCORM/xAPI library, to view SCORM packages.
  2. The H5PEditor SCORM library, to upload and validate SCORM packages.

You can then create a new content type by uploading a package through the H5P editor.

In a nutshell

Different people adopt SCORM for different reasons. You and your team are the only ones that can decide whether sticking to SCORM is worthwhile or not. 

Depending on the nature of your requirements and course of action, you can decide which platform is best for you. At OpenSense Labs, we have been providing fitting solutions to our customers. Contact us at hello@opensenselabs.com to make the right choice of platform.


Grazitti Interactive: Grazitti’s Drupal Marketo Connector Helps Personalize Content on Your Drupal Website

Planet Drupal - Fri, 2018/12/14 - 8:02am

Drupal is one of the leading open source and secure content management systems used by businesses across the world. Drupal [...]

READ MORE


Hook 42: 'Tis the Season for Giving: A Retrospective on Drupal Contribution

Planet Drupal - Fri, 2018/12/14 - 12:02am

'Tis the season for giving. This is the first of many articles about why and how to give back to the community. The information can be used by individuals, agencies, and companies that want to increase their community contribution efforts.


Lullabot: The Georgia.gov Content Strategy Team

Planet Drupal - Thu, 2018/12/13 - 10:53pm
Mike and Matt talk with the team that helped implement content strategy on Georgia.gov.

Texas Creative: Doing Business with Government Agencies

Planet Drupal - Thu, 2018/12/13 - 10:30pm

Learn what steps your full-service ad agency needs to take in order to work with Federal, State, and Local Agencies. Public affairs campaigns and website development are today’s big business opportunity.

Read More


Wim Leers: State of JSON:API (December 2018)

Planet Drupal - Thu, 2018/12/13 - 1:58pm

Gabe, Mateu and I just released the third RC of JSON:API 2, so time for an update! The last update is from three weeks ago.

What happened since then? In a nutshell:

RC3

Curious about RC3? RC2 → RC3 has five key changes:

  1. ndobromirov is all over the issue queue to fix performance issues: he fixed a critical performance regression in 2.x vs 1.x that is only noticeable when requesting responses with hundreds of resources (entities); he also fixed another performance problem that manifests itself only in those circumstances, but also exists in 1.x.
  2. One major bug was reported by dagmar: the ?filter syntax that we made less confusing in RC2 was a big step forward, but we had missed one particular edge case!
  3. A pretty obscure broken edge case was discovered, but probably fairly common for those creating custom entity types: optional entity reference base fields that are empty made the JSON:API module stumble. Turns out optional entity reference fields get different default values depending on whether they’re base fields or configured fields! Fortunately, three people gave valuable information that led to finding this root cause and the solution! Thanks, olexyy, keesee & caseylau!
  4. A minor bug was fixed that only occurs when installing JSON:API Extras and configuring it in a certain way.
  5. Version 1.1 RC1 of the JSON:API spec was published; it includes two clarifications to the existing spec. We already were doing one of them correctly (test coverage added to guarantee it), and the other one we are now complying with too. Everything else in version 1.1 of the spec is additive, this is the only thing that could be disruptive, so we chose to do it ASAP.

So … now is the time to update to 2.0-RC3. We’d love the next release of JSON:API to be the final 2.0 release!

P.S.: if you want fixes to land quickly, follow dagmar’s example:

If you don't know how to fix a bug of a #drupal module, providing a failing test usually is really helpful to guide project maintainers. Thanks! @GabeSullice and @wimleers for fixing my bug report https://t.co/bEkkjSrE8U

— Mariano D'Agostino (@cuencodigital) December 11, 2018
  1. Note that usage statistics on drupal.org are an underestimation! Any site can opt out from reporting back, and composer-based installs don’t report back by default.

  2. Since we’re in the RC phase, we’re limiting ourselves to only critical issues.

  3. This is the first officially proposed JSON:API profile!


Agiledrop.com Blog: Interview with Kevin Kaland, aka wizonesolutions: Towards a more and more decoupled Drupal

Planet Drupal - Thu, 2018/12/13 - 1:03pm

Meet Kevin Kaland, perhaps more easily recognized by his Twitter handle wizonesolutions, the digital wizard responsible for the FillPDF module. In this interview, he talks about his first interactions with Drupal and reveals his thoughts on the future of Drupal as a decoupled system. 

READ MORE
Categories:

Drupal Association blog: Introducing the (unofficial) Drupal Recording Initiative

Planet Drupal - Wed, 2018/12/12 - 10:53pm

This is a guest blog by Kevin Thull (kthull).

Shout-out to Matt Westgate of Lullabot, whom I met earlier this year at DrupalCorn Camp in Des Moines, for casually referring to what I’ve been doing for the past five years as the “Drupal Recording Initiative.” It was yet another of those moments when I realized that this work is much bigger than just me.

Let’s recap

What began in 2013 as an effort to record sessions for camps in my local community became a passion project and a way for me to help a handful of other camps record their sessions. That was the extent of my “vision” for this project.

With the advent of Slack (specifically the Drupal Camp Organizer Slack) it became even easier for camp organizers to discover and request my services, and suddenly I was recording a dozen-plus camps per year. With more robust documentation and enough inventory to ship kits to camps that I can’t attend, I have nearly complete coverage of camps in North America.

Turning point

With this year’s milestones, funding, and recognition (and all the publicity that came with it), conversations at camps turned more toward “what’s next” and “how do you grow this” rather than “how and why.”

As I started to think along the lines of future-proofing, open-sourcing, and growth, I made some steps in the right direction this past year (again with no goals or plan):

  • Tweaks to the kits and troubleshooting for more reliable recording
  • Docs moved from Google Drive to GitHub for discoverability and collaboration
  • A growing contributor list: basically anyone who has helped me in the trenches, expressed interest in doing so, recorded sessions with my equipment, or has their own set of equipment based on my setup.

The initiative

This is rough, and the point of this post is to get input from the community.

Purpose

Make the recording of Drupal and related talks at camps, summits, meetups (essentially anything outside of DrupalCon) easy, turn-key, affordable, and available.

Roadmap

The first five years evolved organically and overall successfully, but without a plan. Following are the goals for the next five years.

Training and mentorship

I’ve proven that I can do this and do it very well. My goal is to spend an increasing amount of time teaching and supporting others to repeat my success.

  • Improved camp support – I already offer support for a few camps, primarily via email and Slack before and during events; need to schedule more check-ins during events
  • Post event – Schedule post-event calls to debrief and discover gaps in documentation and training
  • Ongoing solicitation for contributors – Identify and somehow organize a group of people that can manage the recordings whether I’m on-site or not to continually spread the knowledge and coverage; the goal here is one new contact per event

Expanded coverage

While it sounds impressive to directly or indirectly record nearly all North American camps, it’s not enough. There are many more events than just camps, and there’s so much more to the world than North America.

  • Shipping kits – I’ve shipped to two events in 2018. The goal is to double that each successive year until there is sufficient global coverage of camps and larger events; smaller events like meetups would need to be covered by local equipment hubs, detailed below
  • Funding for equipment hubs – Navigating customs can prove tricky and equipment hubs within countries or regions would mitigate that risk; possible sources include crowdsourced funding or the Drupal Community Cultivation Grant (more info on this toward the end of the post)
  • Expanding beyond Drupal events – An early goal was to make a device-agnostic recording solution, so it only makes sense that it should also be content-agnostic; primary focus would be adjunct communities: WordPress, PHP, Symfony, JavaScript, etc.

Improved documentation

Moving existing docs to GitHub was a step in the right direction. More visuals are needed.

  • Record a setup video
  • Record a troubleshooting video
  • Add more pics and diagrams
  • Create docs for speakers: what to bring, what to do, what not to do, how to optimize your laptop for presenting, etc.
  • Create docs for organizers: what is provided, what is needed from the venue or camp, what speakers need to know, what room monitors need to know, etc.

Higher recording success rate

In 2018, the capture rate based on my documentation and remote support was 80-85% versus 92-100% when I was on the scene. The goal here is to bridge that gap.

  • Better pre-event support and onboarding
  • Broader support for USB-C
  • Improved coverage for non-Mac audio issues
  • Better prep for volunteers and contributors for on-site coverage

Streamlined funding

This is an expensive endeavor, and I couldn’t do it without camp donations, and reimbursements for airfare and lodging. At the same time, incidental costs (food, entertainment, commuting, etc.) add up, and the current model limits me geographically.

  • Charge a flat fee of $1,000 per event (airfare and lodging typically run $350-$750)
  • Roll surplus funds into new equipment and subsidized travel to events outside North America
  • Maintain existing crowdfunded campaign to cover personal costs (whether current GoFundMe or an alternate)
  • Potentially seek sponsorship or Drupal Community Cultivation Grant (again, see below)

Overall organization

This is definitely the squishiest part of the roadmap. Solo, this is easy to manage. But to grow, I need tools and I don’t know the best ones for this type of work.

  • Contacts – What is the best way to manage the growing list of contributors, as well as give recognition?
  • Scheduling – There should be an event calendar with ways to sign up as a recording volunteer
  • Accounting – There should be an open way to manage the funds
  • Outreach – What is the best way to publicly and continually reach out for new contributors, and to grow the base of recorded events?
  • Inventory – We need a managed inventory of equipment, who controls them, whether they are available or on loan, their condition, etc.
  • Options:

    1. Drupal.org community project
    2. GitHub project board
    3. Slack channel (they already exist in Drupal Slack and the Organizer Slack)
    4. Public meetings
    5. Other / All of the above

Content discoverability

I am not the only person recording sessions, but I am definitely the loudest and most prolific. Still, my tweets and siloed camp YouTube channels are not optimal ways for the greater community to find content. I don’t have the bandwidth to do this part personally, but I would contribute.

  • We absolutely need a Drupal equivalent to WordPress.tv (several solutions are in the works, but that also has been the state of things for years)
  • Aggregate content from all events, including DrupalCon
  • Include curation and content authors to annotate various versions of the same talk, or even unpublish older or largely repetitive versions
  • Add tagging for searchability
  • Add captioning for accessibility
  • Promote local events as well as the Drupal project

Wrapping it up

I’m continually thanked and reminded of how important what I’m doing is for the community. At the same time, it’s hard for me because it feels pretty routine and relatively easy (aside from the unknowns that come with each venue setup, and the hourly hustle to connect presenters and confirm recordings). Yet recorded content is also how I first learned Drupal and it’s the very reason I began this effort.

So the next stage of this unexpected, unplanned success is to create a structure to prevent my own burnout and prevent this initiative from hitting a plateau. If you want to participate, hit me up on Drupal.org, Twitter, or send me an email.

Lastly, for those who don’t know, the Drupal Community Cultivation Grant exists for things like this. I am very grateful to the Drupal Association for offering me a grant to support this program, even though I hadn’t yet applied (I should have, and you should too).

Categories:

Drupal blog: Plan for Drupal 9

Planet Drupal - Wed, 2018/12/12 - 7:38pm

This blog has been re-posted and edited with permission from Dries Buytaert's blog. Please leave your comments on the original post.

At Drupal Europe, I announced that Drupal 9 will be released in 2020. Although I explained why we plan to release in 2020, I wasn't very specific about when we plan to release Drupal 9 in 2020. Given that 2020 is less than thirteen months away (gasp!), it's time to be more specific.

Shifting Drupal's six month release cycle

We shifted Drupal 8's minor release windows so we can adopt Symfony's releases faster.

Before I talk about the Drupal 9 release date, I want to explain another change we made, which has a minor impact on the Drupal 9 release date.

As announced over two years ago, Drupal 8 adopted a 6-month release cycle (two releases a year). Symfony, a PHP framework that Drupal depends on, uses a similar release schedule. Unfortunately, Drupal's minor releases have historically occurred 1-2 months before Symfony's releases, which forced us to wait six months to adopt the latest Symfony release. To be able to adopt the latest Symfony releases faster, we are moving Drupal's minor releases to June and December. This will allow us to adopt the latest Symfony releases within one month. For example, Drupal 8.8.0 is now scheduled for December 2019.

We hope to release Drupal 9 on June 3, 2020

Drupal 8's biggest dependency is Symfony 3, which has an end-of-life date in November 2021. This means that after November 2021, security bugs in Symfony 3 will not get fixed. Therefore, we have to end-of-life Drupal 8 no later than November 2021. Or put differently, by November 2021, everyone should be on Drupal 9.

Working backwards from November 2021, we'd like to give site owners at least one year to upgrade from Drupal 8 to Drupal 9. While we could release Drupal 9 in December 2020, we decided it was better to try to release Drupal 9 on June 3, 2020. This gives site owners 18 months to upgrade. Plus, it also gives the Drupal core contributors an extra buffer in case we can't finish Drupal 9 in time for a summer release.

Planned Drupal 8 and 9 minor release dates.

We are building Drupal 9 in Drupal 8

Instead of working on Drupal 9 in a separate codebase, we are building Drupal 9 in Drupal 8. This means that we are adding new functionality as backwards-compatible code and experimental features. Once the code becomes stable, we deprecate any old functionality.

Let's look at an example. As mentioned, Drupal 8 currently depends on Symfony 3. Our plan is to release Drupal 9 with Symfony 4 or 5. Symfony 5's release is less than one year away, while Symfony 4 was released a year ago. Ideally Drupal 9 would ship with Symfony 5, both for the latest Symfony improvements and for longer support. However, Symfony 5 hasn't been released yet, so we don't know the scope of its changes, and we will have limited time to try to adopt it before Symfony 3's end-of-life.

We are currently working on making it possible to run Drupal 8 with Symfony 4 (without requiring it). Supporting Symfony 4 is a valuable stepping stone toward Symfony 5: it brings new capabilities to sites that choose to use it, and it reduces the amount of Symfony 5 upgrade work for Drupal core developers. In the end, our goal is for Drupal 8 to work with Symfony 3, 4 or 5 so we can identify and fix any issues before we start requiring Symfony 4 or 5 in Drupal 9.

Another example is our support for reusable media. Drupal 8.0.0 launched without a media library. We are currently working on adding a media library to Drupal 8 so content authors can select pre-existing media from a library and easily embed them in their posts. Once the media library becomes stable, we can deprecate the use of the old file upload functionality and make the new media library the default experience.
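
That "deprecate the old functionality" step has a standard shape in Drupal 8 code. Below is a minimal sketch of the pattern; the function names are hypothetical, and the exact message format follows core's deprecation policy:

    <?php

    /**
     * Attaches a file to an entity the old way.
     *
     * @deprecated in Drupal 8.x, will be removed before Drupal 9.0.0.
     *   Use my_module_attach_media() instead.
     */
    function my_module_attach_file($file) {
      // Silenced deprecation notice: logged by test runners and IDEs,
      // invisible to end users.
      @trigger_error('my_module_attach_file() is deprecated, will be removed before Drupal 9.0.0. Use my_module_attach_media() instead.', E_USER_DEPRECATED);
      // Delegate to the new API so existing callers keep working.
      return my_module_attach_media($file);
    }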

The upgrade to Drupal 9 will be easy

Because we are building Drupal 9 in Drupal 8, the technology in Drupal 9 will have been battle-tested in Drupal 8.

For Drupal core contributors, this means that we have a limited set of tasks to do in Drupal 9 itself before we can release it. Releasing Drupal 9 will only depend on removing deprecated functionality and upgrading Drupal's dependencies, such as Symfony. This will make the release timing more predictable and the release quality more robust.

For contributed module authors, it means they already have the new technology at their service, so they can work on Drupal 9 compatibility earlier (e.g. they can start updating their media modules to use the new media library before Drupal 9 is released). Finally, their Drupal 8 know-how will remain highly relevant in Drupal 9, as there will not be a dramatic change in how Drupal is built.

But most importantly, for Drupal site owners, this means that it should be much easier to upgrade to Drupal 9 than it was to upgrade to Drupal 8. Drupal 9 will simply be the last version of Drupal 8, with its deprecations removed. This means we will not introduce new, backwards-compatibility breaking APIs or features in Drupal 9 except for our dependency updates. As long as modules and themes stay up-to-date with the latest Drupal 8 APIs, the upgrade to Drupal 9 should be easy. Therefore, we believe that a 12- to 18-month upgrade period should suffice.

So what is the big deal about Drupal 9, then?

The big deal about Drupal 9 is … that it should not be a big deal. The best way to be ready for Drupal 9 is to keep up with Drupal 8 updates. Make sure you are not using deprecated modules and APIs, and where possible, use the latest versions of dependencies. If you do that, your upgrade experience will be smooth, and that is a big deal for us.

Special thanks to Gábor Hojtsy (Acquia), Angie Byron (Acquia), xjm (Acquia), and catch for their input in this blog post.

Categories:

Jacob Rockowitz: Managing Drupal and Webform configuration

Planet Drupal - Wed, 2018/12/12 - 6:58pm

Webforms in Drupal 8 are configuration entities, which means that they are exportable to YAML files and this makes it easy to transfer a webform from one server environment to another. Generally, anything that defines functionality or behavior in Drupal 8 is stored as simple configuration or a configuration entity. For example, fields, views, and roles are stored as configuration entities. Things that are considered 'content' are stored in the database as content entities. Content entities include nodes, comments, taxonomy terms, users, and also webform submissions.
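
As a concrete illustration, here is a heavily trimmed sketch of what an exported webform could look like on disk; the machine name and elements are hypothetical, and note that the Webform module stores the element tree as a YAML string under the elements key:

    # config/sync/webform.webform.contact.yml (trimmed, hypothetical)
    id: contact
    title: Contact
    status: open
    elements: |
      name:
        '#type': textfield
        '#title': 'Your name'
      message:
        '#type': textarea
        '#title': Message
        '#required': true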

Managing Configuration

The core concept behind Drupal's configuration management is that you can export and import how a website is configured (aka how it works) from one environment to another. For example, you might want to copy the configuration from your staging server to your production server. Drupal 8 initially took the approach that all configuration from one environment needs to be moved to the new environment. The problem is that…

In the case of webforms and blocks, this is a major issue, because site builders are continually updating these config entities on a production website. The Drupal community is aware of this problem; it has provided some solutions and is actively working to fix this in a future release of Drupal.
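
Concretely, the wholesale workflow described above looks something like this sketch (Drush 9 command names):

    # On the source environment: export the active configuration to YAML files.
    drush config:export

    # Commit and deploy those files; then, on the destination environment, this
    # imports everything, overwriting config that was changed directly in production.
    drush config:import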

Improving Configuration Management

Dries Buytaert shared how we are improving Drupal's configuration management system, and he noted that the currently recommended solutions are the Config Filter, Configuration Split, and Config Ignore modules.
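
As one example of how these modules are used, the Configuration Split documentation describes toggling a split per environment from settings.php; the split name below is hypothetical:

    <?php
    // In settings.php on production: activate the 'live' split so that config
    // managed directly on production (e.g. webforms and blocks) is filtered
    // out of wholesale config imports.
    $config['config_split.config_split.live']['status'] = TRUE;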

Below is a summary...Read More

Categories: