Feeds
Armin Ronacher: Serendipity
KUnifiedPush 1.0.0 is out!
KUnifiedPush provides push notifications for KDE applications. Push notifications support applications that occasionally need to receive information from their server-side counterpart, where timely delivery matters; chat applications and weather or emergency alerts are examples. More technical details are available in Volker's introduction post about KUnifiedPush.
KUnifiedPush provides three possible provider backends for push notifications:
The default provider of KUnifiedPush is Ntfy with unifiedpush.kde.org, but you can change it to your own server in System Settings.
Currently both NeoChat and Tokodon integrate with KUnifiedPush, as both Matrix and Mastodon support UnifiedPush. There is also ongoing work on weather and emergency alerts.
Packager section
You can find the package on download.kde.org; it has been signed with Carl Schwan's GPG key.
KDE Gear 24.12 release schedule
https://community.kde.org/Schedules/KDE_Gear_24.12_Schedule
Dependency freeze is in around 3 weeks (November 7) and feature freeze one week after that. Get your stuff ready!
Dries Buytaert: My solar-powered and self-hosted website
I'm excited to share an experiment I've been working on: a solar-powered, self-hosted website running on a Raspberry Pi. The website at https://solar.dri.es is powered entirely by a solar panel and battery on our roof deck in Boston.
My solar panel and Raspberry Pi Zero 2 are set up on our rooftop deck for testing. Once it works, it will be mounted properly and permanently.
By visiting https://solar.dri.es, you can dive into all the technical details and lessons learned – from hardware setup to networking configuration and custom monitoring.
As the content on this solar-powered site is likely to evolve or might even disappear over time, I've included the full article below (with minor edits) to ensure that this information is preserved.
Finally, you can view the real-time status of my solar setup on my solar panel dashboard, hosted on my main website. This dashboard stays online even when my solar-powered setup goes offline.
Background
For over two decades, I've been deeply involved in web development. I've worked on everything from simple websites to building and managing some of the internet's largest websites. I've helped create a hosting business that uses thousands of EC2 instances, handling billions of page views every month. This platform includes the latest technology: cloud-native architecture, Kubernetes orchestration, auto-scaling, smart traffic routing, geographic failover, self-healing, and more.
This project is the complete opposite. It's a hobby project focused on sustainable, solar-powered self-hosting. The goal is to use the smallest, most energy-efficient setup possible, even if it means the website goes offline sometimes. Yes, this site may go down on cloudy or cold days. But don't worry! When the sun comes out, the website will be back up, powered by sunshine.
My primary website, https://dri.es, is reliably hosted on Acquia, and I'm very happy with it. However, if this solar-powered setup proves stable and efficient, I might consider moving some content to solar hosting. For instance, I could keep the most important pages on traditional hosting while transferring less essential content – like my 10,000 photos – to a solar-powered server.
Why am I doing this?
This project is driven by my curiosity about making websites and web hosting more environmentally friendly, even on a small scale. It's also a chance to explore a local-first approach: to show that hosting a personal website on your own internet connection at home can often be enough for small sites. This aligns with my commitment to both the Open Web and the IndieWeb.
At its heart, this project is about learning and contributing to a conversation on a greener, local-first future for the web. Inspired by solar-powered sites like LowTech Magazine, I hope to spark similar ideas in others. If this experiment inspires even one person in the web community to rethink hosting and sustainability, I'll consider it a success.
Solar panel and battery
The heart of my solar setup is a 50-watt panel from Voltaic, which captures solar energy and delivers 12-volt output. I store the excess power in an 18 amp-hour Lithium Iron Phosphate (LFP or LiFePO4) battery, also from Voltaic.
A solar panel being tested on the floor in our laundry room. Upon connecting it, it started charging a battery right away. It feels truly magical. Of course, it won't stay in the laundry room forever, so stay tuned for more ...
I'll never forget the first time I plugged in the solar panel – it felt like pure magic. Seeing the battery spring to life, powered entirely by sunlight, was an exhilarating moment that is hard to put into words. And yes, all this electrifying excitement happened right in our laundry room.
An 18Ah LFP battery from Voltaic, featuring a waterproof design and an integrated MPPT charge controller. The battery is large and heavy, weighing 3kg (6.6lbs), but it can power a Raspberry Pi for days.
Voltaic's battery system includes a built-in charge controller with Maximum Power Point Tracking (MPPT) technology, which regulates the solar panel's output to optimize battery charging. In addition, the MPPT controller protects the battery from overcharging, extreme temperatures, and short circuits.
A key feature of the charge controller is its ability to stop charging when temperatures fall below 0°C (32°F). This preserves battery health, as charging in freezing conditions can damage the battery cells. As I'll discuss in the Next steps section, this safeguard complicates year-round operation in Boston's harsh winters. I'll likely need a battery that can charge in colder temperatures.
The 12V to 5V voltage converter used to convert the 12V output from the solar panel to 5V for the Raspberry Pi.
I also encountered a voltage mismatch between the 12-volt solar panel output and the Raspberry Pi's 5-volt input requirement. Fortunately, this had a straightforward solution: a buck converter steps down the voltage. While the conversion introduces some energy loss, it allows me to use a more powerful solar panel.
Raspberry Pi models
This website is currently hosted on a Raspberry Pi Zero 2 W. The main reason for choosing the Raspberry Pi Zero 2 W is its energy efficiency. Consuming just 0.4 watts at idle and up to 1.3 watts under load, it can run on my battery for about a week. This decision is supported by a mathematical uptime model, detailed in Appendix 1.
That said, the Raspberry Pi Zero 2 W has limitations. Despite its quad-core 1 GHz processor and 512 MB of RAM, it may still struggle with handling heavier website traffic. For this reason, I also considered the Raspberry Pi 4. With its 1.5 GHz quad-core ARM processor and 4 GB of RAM, the Raspberry Pi 4 can handle more traffic. However, this added performance comes at a cost: the Pi 4 consumes roughly five times the power of the Zero 2 W. As shown in Appendix 2, my 50W solar panel and 18Ah battery setup are likely insufficient to power the Raspberry Pi 4 through Boston's winter.
With a single-page website now live on https://solar.dri.es, I'm actively monitoring the real-world performance and uptime of a solar-powered Raspberry Pi Zero 2 W. For now, I'm using the lightest setup that I have available and will upgrade only when needed.
Networking
The Raspberry Pi's built-in Wi-Fi is perfect for our outdoor setup. It wirelessly connects to our home network, so no extra wiring was needed.
I want to call out that my router and Wi-Fi network are not solar-powered; they rely on my existing network setup and conventional power sources. So while the web server itself runs on solar power, other parts of the delivery chain still depend on traditional energy.
Running this website on my home internet connection also means that if my ISP or networking equipment goes down, so does the website – there is no failover in place.
For security reasons, I isolated the Raspberry Pi in its own Virtual Local Area Network (VLAN). This ensures that even if the Pi is compromised, the rest of our home network remains protected.
To make the solar-powered website accessible from the internet, I configured port forwarding on our router. This directs incoming web traffic on port 80 (HTTP) and port 443 (HTTPS) to the Raspberry Pi, enabling external access to the site.
One small challenge was the dynamic nature of our IP address. ISPs typically do not assign fixed IP addresses, meaning our IP address changes from time to time. To keep the website accessible despite these IP address changes, I wrote a small script that looks up our public IP address and updates the DNS record for solar.dri.es on Cloudflare. This script runs every 10 minutes via a cron job.
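A dynamic DNS updater along these lines can be sketched in a few lines of Python. This is a hedged sketch rather than the author's actual script: the ipify lookup service, the environment-variable names, and the record settings (TTL, proxied) are all assumptions. Cloudflare's API does expect a zone ID, a DNS record ID, and an API token.

```python
import json
import os
import urllib.request

# Assumed configuration: supply your own Cloudflare zone ID,
# DNS record ID, and API token via environment variables.
CF_API = "https://api.cloudflare.com/client/v4"
ZONE_ID = os.environ.get("CF_ZONE_ID", "")
RECORD_ID = os.environ.get("CF_RECORD_ID", "")
TOKEN = os.environ.get("CF_API_TOKEN", "")

def current_public_ip() -> str:
    """Look up the public IP via a plain-text echo service (assumed: ipify)."""
    with urllib.request.urlopen("https://api.ipify.org") as resp:
        return resp.read().decode().strip()

def build_update(ip: str, name: str = "solar.dri.es") -> dict:
    """Build the JSON payload for updating an A record."""
    return {"type": "A", "name": name, "content": ip, "ttl": 300, "proxied": True}

def update_dns(ip: str) -> None:
    """PUT the new address to Cloudflare's 'update DNS record' endpoint."""
    req = urllib.request.Request(
        f"{CF_API}/zones/{ZONE_ID}/dns_records/{RECORD_ID}",
        data=json.dumps(build_update(ip)).encode(),
        headers={"Authorization": f"Bearer {TOKEN}",
                 "Content-Type": "application/json"},
        method="PUT",
    )
    urllib.request.urlopen(req)

# From a cron job every 10 minutes: update_dns(current_public_ip())
```

A cron entry such as `*/10 * * * * /usr/bin/python3 /home/pi/update_dns.py` (hypothetical path) would match the 10-minute cadence described above.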
I use Cloudflare's DNS proxy, which handles DNS and offers basic DDoS protection. However, I do not use Cloudflare's caching or CDN features, as that would somewhat defeat the purpose of running this website on solar power and keeping it local-first.
The Raspberry Pi uses Caddy as its web server, which automatically obtains SSL certificates from Let's Encrypt. This setup ensures secure, encrypted HTTP connections to the website.
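For reference, the Caddy configuration for such a site can be remarkably small. This is a hypothetical minimal Caddyfile, not the author's actual configuration; the site root path is an assumption. Given a public hostname, Caddy obtains and renews the Let's Encrypt certificate automatically:

```
# Hypothetical minimal Caddyfile; Caddy handles the
# Let's Encrypt certificate for this hostname on its own.
solar.dri.es {
    root * /var/www/solar
    file_server
}
```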
Monitoring and dashboard
The Raspberry Pi 4 (on the left) can run a website, while the RS485 CAN HAT (on the right) will communicate with the charge controller for the solar panel and battery.
One key feature that influenced my decision to go with the Voltaic battery is its RS485 interface for the charge controller. This allowed me to add an RS485 CAN HAT (Hardware Attached on Top) to the Raspberry Pi, enabling communication with the charge controller using the Modbus protocol. In turn, this enabled me to programmatically gather real-time data on the solar panel's output and battery's status.
I collect data such as battery capacity, power output, temperature, uptime, and more. I send this data to my main website via a web service API, where it's displayed on a dashboard. This setup ensures that key information remains accessible, even if the Raspberry Pi goes offline.
My main website runs on Drupal. The dashboard is powered by a custom module I developed. This module adds a web service endpoint to handle authentication, validate incoming JSON data, and store it in a MariaDB database table. Using the historical data stored in MariaDB, the module generates Scalable Vector Graphics (SVGs) for the dashboard graphs. For more details, check out my post on building a temperature and humidity monitor, which explains a similar setup in much more detail. Sure, I could have used a tool like Grafana, but sometimes building it yourself is part of the fun.
A Raspberry Pi 4 with an attached RS485 CAN HAT module is being installed in a waterproof enclosure.
For more details on the charge controller and some of the issues I've observed, please refer to Appendix 3.
Energy use, cost savings, and environmental impact
When I started this solar-powered website project, I wasn't trying to revolutionize sustainable computing or drastically cut my electricity bill. I was driven by curiosity, a desire to have fun, and a hope that my journey might inspire others to explore local-first or solar-powered hosting.
That said, let's break down the energy consumption and cost savings to get a better sense of the project's impact.
The tiny Raspberry Pi Zero 2 W at the heart of this project uses just 1 Watt on average. This translates to 0.024 kWh daily (1W * 24h / 1000 = 0.024 kWh) and approximately 9 kWh annually (0.024 kWh * 365 days = 8.76 kWh). The cost savings? Looking at our last electricity bill, we pay an average of $0.325 per kWh in Boston. This means the savings amount to $2.85 USD per year (8.76 kWh * $0.325/kWh = $2.85). Not exactly something to write home about.
The environmental impact is similarly modest. Saving 9 kWh per year reduces CO2 emissions by roughly 4 kg, which is about the same as driving 16 kilometers (10 miles) by car.
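The arithmetic above is easy to check in a few lines. In this sketch, the CO2 factor of 0.45 kg per kWh is my own assumption for a grid-average emission rate, chosen to be roughly consistent with the ~4 kg figure mentioned:

```python
# Back-of-the-envelope numbers from the article.
AVG_POWER_W = 1.0          # average draw of the Pi Zero 2 W
PRICE_PER_KWH = 0.325      # USD, from the Boston electricity bill
CO2_KG_PER_KWH = 0.45      # assumed grid-average emission factor

daily_kwh = AVG_POWER_W * 24 / 1000          # 0.024 kWh per day
annual_kwh = daily_kwh * 365                 # ~8.76 kWh per year
annual_savings = annual_kwh * PRICE_PER_KWH  # ~$2.85 per year
annual_co2_kg = annual_kwh * CO2_KG_PER_KWH  # ~3.9 kg CO2 per year

print(f"{annual_kwh:.2f} kWh/year, ${annual_savings:.2f} saved, "
      f"{annual_co2_kg:.1f} kg CO2 avoided")
```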
There are two ways to interpret these numbers. The pessimist might say that the impact of my solar setup is negligible, and they wouldn't be wrong. Offsetting the energy use of a Raspberry Pi Zero 2, which only draws 1 Watt, will never be game-changing. The $2.85 USD saved annually won't come close to covering the cost of the solar panel and battery. In terms of efficiency, this setup isn't a win.
But the optimist in me sees it differently. When you compare my solar-powered setup to traditional website hosting, a more compelling case emerges. Using a low-power Raspberry Pi to host a basic website, rather than large servers in energy-hungry data centers, can greatly cut down on both expenses and environmental impact. Consider this: a Raspberry Pi Zero 2 W costs just $15 USD, and I can power it with mains power for only $0.50 USD a month. In contrast, traditional hosting might cost around $20 USD a month. Viewed this way, my setup is both more sustainable and economical, showing some merit.
Lastly, it's also important to remember that solar power isn't just about saving money or cutting emissions. In remote areas without grid access or during disaster relief, solar can be the only way to keep communication systems running. In a crisis, a small solar setup could make the difference between isolation and staying connected to essential information and support.
Why do so many websites need to stay up?
The reason the energy savings from my solar-powered setup won't offset the equipment costs is that the system is intentionally oversized to keep the website running during extended low-light periods. Once the battery reaches full capacity, any excess energy goes to waste. That's unfortunate, as putting that surplus to use would help offset more of the hardware costs.
This inefficiency isn't unique to solar setups – it highlights a bigger issue in web hosting: over-provisioning. The web hosting world is full of mostly idle hardware. Web hosting providers often allocate more resources than necessary to ensure high uptime or failover, and this comes at an environmental cost.
One way to make web hosting more eco-friendly is by allowing non-essential websites to experience more downtime, reducing the need to power as much hardware. Of course, many websites are critical and need to stay up 24/7 – my own work with Acquia is dedicated to ensuring essential sites do just that. But for non-critical websites, allowing some downtime could go a long way in conserving energy.
It may seem unconventional, but I believe it's worth considering: many websites, mine included, aren't mission-critical. The world won't end if they occasionally go offline. That is why I like the idea of hosting my 10,000 photos on a solar-powered Raspberry Pi.
And maybe that is the real takeaway from this experiment so far: to question why our websites and hosting solutions have become so resource-intensive and why we're so focused on keeping non-essential websites from going down. Do we really need 99.9% uptime for personal websites? I don't think so.
Perhaps the best way to make the web more sustainable is to accept more downtime for those websites that aren't critical. By embracing occasional downtime and intentionally under-provisioning non-essential websites, we can make the web a greener, more efficient place.
The solar panel and battery mounted on our roof deck.
Next steps
As I continue this experiment, my biggest challenge is the battery's inability to charge in freezing temperatures. As explained, the battery's charge controller includes a safety feature that prevents charging when the temperature drops below freezing. While the Raspberry Pi Zero 2 W can run on my fully charged battery for about six days, this won't be sufficient for Boston winters, where temperatures often remain below freezing for longer.
With winter approaching, I need a solution to charge my battery in extreme cold. Several options to consider include:
- Adding a battery heating system that uses excess energy during peak sunlight hours.
- Applying insulation, though this alone may not suffice since the battery generates minimal heat.
- Replacing the battery with one that charges at temperatures as low as -20°C (-4°F), such as Lithium Titanate (LTO) or certain AGM lead-acid batteries. However, it's not as simple as swapping it out – my current battery has a built-in charge controller, so I'd likely need to add an external charge controller, which would require rewiring the solar panel and updating my monitoring code.
Each solution has trade-offs in cost, safety, and complexity. I'll need to research the different options carefully to ensure safety and reliability.
The last quarter of the year is filled with travel and other commitments, so I may not have time to implement a fix before freezing temperatures hit. With some luck, the current setup might make it through winter. I'll keep monitoring performance and uptime – and, as mentioned, a bit of downtime is acceptable and even part of the fun! That said, the website may go offline for a few weeks and restart after the harshest part of winter. Meanwhile, I can focus on other aspects of the project.
For example, I plan to expand this single-page site into one with hundreds or even thousands of pages. Here are a few things I'd like to explore:
- Testing Drupal on a Raspberry Pi Zero 2 W: As the founder and project lead of Drupal, my main website runs on Drupal. I'm curious to see if Drupal can actually run on a Raspberry Pi Zero 2 W. The answer might be "probably not", but I'm eager to try.
- Upgrading to a Raspberry Pi 4 or 5: I'd like to experiment with upgrading to a Raspberry Pi 4 or 5, as I know it could run Drupal. As noted in Appendix 2, this might push the limits of my solar panel and battery. There are some optimization options to explore though, like disabling CPU cores, lowering the RAM clock speed, and dynamically adjusting features based on sunlight and battery levels.
- Creating a static version of my site: I'm interested in experimenting with a static version of https://dri.es. A static site doesn't require PHP or MySQL, which would likely reduce resource demands and make it easier to run on a Raspberry Pi Zero 2 W. However, dynamic features like my solar dashboard depend on PHP and MySQL, so I'd potentially need alternative solutions for those. Tools like Tome and QuantCDN offer ways to generate static versions of Drupal sites, but I've never tested these myself. Although I prefer keeping my site dynamic, creating a static version also aligns with my interests in digital preservation and archiving, offering me a chance to delve deeper into these concepts.
Either way, it looks like I'll have some fun ahead. I can explore these ideas from my office while the Raspberry Pi Zero 2 W continues running on the roof deck. I'm open to suggestions and happy to share notes with others interested in similar projects. If you'd like to stay updated on my progress, you can sign up to receive new posts by email or subscribe via RSS. Feel free to email me at dries@buytaert.net. Your ideas, input, and curiosity are always welcome.
Appendix 1: Sizing a solar panel and battery for a Raspberry Pi Zero 2 W
To keep the Raspberry Pi Zero 2 W running in various weather conditions, we need to estimate the ideal solar panel and battery size. We'll base this on factors like power consumption, available sunlight, and desired uptime.
The Raspberry Pi Zero 2 W is very energy-efficient, consuming only 0.4W at idle and up to 1.3W under load. For simplicity, we'll assume an average power consumption of 1W, which totals 24Wh per day (1W * 24 hours).
We also need to account for energy losses due to inefficiencies in the solar panel, charge controller, battery, and inverter. Assuming a total loss of 30%, our estimated daily energy requirement is 24Wh / 0.7 ≈ 34.3Wh.
In Boston, peak sunlight varies throughout the year, averaging 5-6 hours per day in summer (June-August) and only 2-3 hours per day in winter (December-February). Peak sunlight refers to the strongest, most direct sunlight hours. Basing the design on peak sunlight hours rather than total daylight hours provides a margin of safety.
To produce 34.3Wh in the winter, with only 2 hours of peak sunlight, the solar panel should generate about 17.15W (34.3Wh / 2 hours ≈ 17.15W). As mentioned, my current setup includes a 50W solar panel, which provides well above the estimated 17.15W requirement.
Now, let's look at battery sizing. As explained, I have an 18Ah battery, which provides about 216Wh of capacity (18Ah * 12V = 216Wh). If there were no sunlight at all, this battery could power the Raspberry Pi Zero 2 W for roughly 6 days (216Wh / 34.3Wh per day ≈ 6.3 days), ensuring continuous operation even on snowy winter days.
These estimates suggest that I could halve both my 50W solar panel and 18Ah battery to a 25W panel and a 9Ah battery, and still meet the Raspberry Pi Zero 2 W's power needs during Boston winters. However, I chose the 50W panel and larger battery for flexibility, in case I need to upgrade to a more powerful board with higher energy requirements.
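The calculations in this appendix can be wrapped in a small, reusable sizing function. This is an illustrative sketch of the same arithmetic, not a rigorous solar-design tool; the function name and its defaults (30% losses, 6 days of autonomy, a 12V battery) are my own choices mirroring the text:

```python
def size_system(avg_watts, peak_sun_hours, system_loss=0.30,
                autonomy_days=6, battery_v=12):
    """Estimate minimum panel wattage and battery capacity (Ah).

    Mirrors the appendix math: daily energy is inflated for system
    losses, then divided by winter peak-sun hours to size the panel;
    the battery is sized for the desired days of autonomy.
    """
    daily_wh = avg_watts * 24 / (1 - system_loss)   # e.g. 24Wh / 0.7 = 34.3Wh
    panel_w = daily_wh / peak_sun_hours
    battery_ah = daily_wh * autonomy_days / battery_v
    return round(panel_w, 1), round(battery_ah, 1)

# Pi Zero 2 W (1W average) in a Boston winter (~2 peak sun hours):
print(size_system(1.0, 2))   # roughly 17W panel, 17Ah battery
# Pi 4 (4.5W average), same conditions:
print(size_system(4.5, 2))   # roughly 77W panel, 77Ah battery for 6 days
```

The first result matches the 17.15W and ~18Ah figures above; the second shows why a Pi 4 pushes well past the 50W panel, as Appendix 2 discusses.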
Appendix 2: Sizing a solar panel and battery for a Raspberry Pi 4
If I need to switch to a Raspberry Pi 4 to handle increased website traffic, the power requirements will rise significantly. The Raspberry Pi 4 consumes around 3.4W at idle and up to 7.6W under load. For estimation purposes, I'll assume an average consumption of 4.5W, which totals 108Wh per day (4.5W * 24 hours = 108Wh).
Factoring in a 30% loss due to system inefficiencies, the adjusted daily energy requirement increases to approximately 154.3Wh (108Wh / 0.7 ≈ 154.3Wh). To meet this demand during winter, with only 2 hours of peak sunlight, the solar panel would need to produce about 77.15W (154.3Wh / 2 hours ≈ 77.15W).
While some margin of safety is built into my calculations, this likely means my current 50W solar panel and 216Wh battery are insufficient to power a Raspberry Pi 4 during a Boston winter.
For example, with an average power draw of 4.5W, the Raspberry Pi 4 requires 108Wh daily. In winter, if the solar panel generates only 70 to 105Wh per day, there would be a shortfall of 3 to 38Wh each day, which the battery would need to cover. And with no sunlight at all, a fully charged 216Wh battery would keep the system running for about 2 days (216Wh / 108Wh per day ≈ 2 days) before depleting.
To ensure reliable operation, a 100W solar panel, capable of generating enough power with just 2 hours of winter sunlight, paired with a 35Ah battery providing 420Wh, could be better. This setup, roughly double my current capacity, would offer sufficient backup to keep the Raspberry Pi 4 running for 3-4 days without sunlight.
Appendix 3: Observations on the Lumiax charge controller
As I mentioned earlier, my battery has a built-in charge controller. The brand of the controller is Lumiax, and I can access its data programmatically. While the controller excels at managing charging, its metering capabilities feel less robust. Here are a few observations:
- I reviewed the charge controller's manual to clarify how it defines and measures different currents, but the information provided was insufficient.
- The charge controller allows monitoring of the "solar current" (register 12367). I expected this to measure the current flowing from the solar panel to the charge controller, but it actually measures the current flowing from the charge controller to the battery. In other words, it tracks the "useful current" – the current from the solar panel used to charge the battery or power the load. The problem with this is that when the battery is fully charged, the controller reduces the current from the solar panel to prevent overcharging, even though the panel could produce more. As a result, I can't accurately measure the maximum power output of the solar panel. For example, in full sunlight with a fully charged battery, the calculated power output could be as low as 2W, even though the solar panel is capable of producing 50W.
- The controller also reports the "battery current" (register 12359), which appears to represent the current flowing from the battery to the Raspberry Pi. I believe this to be the case because the "battery current" turns negative at night, indicating discharge.
- Additionally, the controller reports the "load current" (register 12362), which, in my case, consistently reads zero. This is odd because my Raspberry Pi Zero 2 typically draws between 0.1-0.3A. Even with a Raspberry Pi 4, drawing between 0.6-1.3A, the controller still reports 0A. This could be a bug or suggest that the charge controller lacks sufficient accuracy.
- When the battery discharges and the low voltage protection activates, it shuts down the Raspberry Pi as expected. However, if there isn't enough sunlight to recharge the battery within a certain timeframe, the Raspberry Pi does not automatically reboot. Instead, I must perform a manual 'factory reset' of the charge controller. This involves connecting my laptop to the controller – a cumbersome process that requires me to disconnect the Raspberry Pi, open its waterproof enclosure, detach the RS485 hat wires, connect them to a USB-to-RS485 adapter for my laptop, and run a custom Python script. Afterward, I have to reverse the entire process. This procedure can't be performed while traveling as it requires physical access.
- The charge controller has two temperature sensors: one for the environment and one for the controller itself. However, the controller's temperature readings often seem inaccurate. For example, while the environment temperature might correctly register at 24°C, the controller could display a reading as low as 14°C. This seems questionable though there might be an explanation that I'm overlooking.
- The battery's charge and discharge patterns are non-linear, meaning the charge level may drop rapidly at first, then stay steady for hours. For example, I've seen it drop from 100% to 65% within an hour but remain at 65% for over six hours. This is common for LFP batteries due to their voltage characteristics. Some advanced charge controllers use look-up tables, algorithms, or coulomb counting to more accurately predict the state of charge based on the battery type and usage patterns. The Lumiax doesn't support this, but I might be able to implement coulomb counting myself by tracking the current flow to improve charge level estimates.
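The coulomb-counting idea from the last bullet could be sketched as a small estimator that integrates measured current over time. This is a hypothetical illustration, not code from the actual setup; the class name and polling cadence are assumptions, and a production version would also correct for battery efficiency and temperature:

```python
class CoulombCounter:
    """Naive state-of-charge estimator: integrate current over time.

    Positive amps mean charging, negative mean discharging.
    Hypothetical sketch: feed it the battery current read from the
    charge controller (e.g. register 12359) at a regular interval.
    """

    def __init__(self, capacity_ah: float, soc: float = 1.0):
        self.capacity_ah = capacity_ah
        self.soc = soc  # 1.0 == 100% charged

    def update(self, amps: float, seconds: float) -> float:
        """Fold one current sample into the running charge estimate."""
        delta_ah = amps * seconds / 3600.0
        self.soc = min(1.0, max(0.0, self.soc + delta_ah / self.capacity_ah))
        return self.soc

counter = CoulombCounter(capacity_ah=18, soc=0.65)
# One hour of discharging at 0.3A (roughly a busy Pi Zero 2 W):
print(counter.update(-0.3, 3600))  # a bit above 0.63
```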
When buying a solar panel, sometimes it's easier to beg for forgiveness than to ask for permission.
One day, I casually mentioned to my wife, "Oh, by the way, I bought something. It will arrive in a few days."
"What did you buy?", she asked, eyebrow raised.
"A solar panel", I said, trying to sound casual.
"A what?!", she asked again, her voice rising.
"Don't worry!", I reassured her. "It's not that big", I said, gesturing with my hands to show a panel about the size of a laptop.
She looked skeptical but didn't push further.
Fast forward to delivery day. As I unboxed it, her eyes widened in surprise. The panel was easily four or five times larger than what I'd shown her. Oops.
The takeaway? Sometimes a little underestimation goes a long way.
Dries Buytaert: Drupal upgrades: tools and workflow
When a new major version of Drupal is released, custom code often requires updates to align with API changes, including the removal of deprecated APIs.
Because I keep forgetting certain aspects of this workflow, I decided to document it for future reference.
Tools overview

- Upgrade Status module (UI in Drupal): identifies deprecated code, hosting environment compatibility, and more. For site administrators and developers.
- Drupal Check (command-line): identifies deprecated code. For developers, especially during coding and continuous integration (CI).

Upgrade Status module
The Upgrade Status module assesses a Drupal site's readiness for major version upgrades by checking for deprecated code and other compatibility issues.
Screenshot of a Drupal upgrade status report showing hosting environment compatibility checks.Install the Upgrade Status module like you would install any other Drupal module:
[code bash]$ ddev composer require --dev drupal/upgrade_status[/code]
Here, ddev is the tool I prefer for managing my local development environment. composer is a dependency manager for PHP, commonly used to install Drupal modules. The --dev option specifies that the module should be installed as a development requirement, meaning it is needed in development environments but not installed on production environments.
Enable the Upgrade Status module:
[code bash]$ ddev drush pm:enable upgrade_status[/code]
drush stands for "Drupal shell" and is a command-line utility for managing Drupal sites. The pm:enable command (where pm stands for "package manager") enables a module in Drupal.
After enabling the module, you can access its features by navigating to the Admin > Reports > Upgrade status page at /admin/reports/upgrade-status.
The Upgrade Status module might recommend updating PHP and MySQL, per Drupal's system requirements.
To update the PHP version of DDEV, use the following command:
[code bash]$ ddev config --php-version 8.3[/code]
To upgrade DDEV's database version and migrate your database content, use the following command:
[code bash]$ ddev debug migrate-database mariadb:10.11[/code]
After updating these settings, I restart DDEV and run my PHPUnit tests. Although these tests are integrated into my CI/CD workflow, I also run them locally on my development machine using DDEV for immediate feedback.
Drupal Check
Drupal Check is a command-line tool that scans Drupal projects for deprecated code and compatibility issues.
I always run drupal-check before updating my Drupal site's code and third-party dependencies. This helps ensure there are no compatibility issues with the current codebase before upgrading. I also run drupal-check after the update to identify any new issues introduced by the updated code.
Output of Drupal Check command indicating no deprecated code was found.
Installation:
[code bash]$ ddev composer require --dev mglaman/drupal-check[/code]
Run Drupal Check from the root of your Drupal installation:
[code bash]$ ./vendor/bin/drupal-check --memory-limit 500M docroot/modules/custom[/code]
I usually have to increase the memory limit, hence the --memory-limit 500M.
In the future, I'd like to evaluate whether using PHPStan directly is simpler. This is a TODO for myself. Drupal Check is essentially a wrapper around PHPStan, offering default configuration such as automatically running at level 2. To achieve the same result with PHPStan, I should be able to simply run:
[code bash]$ php vendor/bin/phpstan analyze -l 2 docroot/modules/custom[/code]
Bounteous.com: Elevating Marketing Impact With Drupal
The Drop is Always Moving: For the first time, Drupal 11.1 will natively support OOP hooks! Dating back to the very early years of Drupal, hooks provide powerful ways to modify processing flows, forms, data structures and many other parts of the system. Read more at https://www.drupal.org/node/3442349
Design System Updates – Oct 2024
Great news, everyone! We now have all the necessary components to start creating our 16px icon collection. I believe that we can work on the design and creation of these icons simultaneously.
Here are some general guidelines to consider when designing the 16px icons:
- Our objective is to correlate the designs from the 24px collection to the 16px as closely as possible.
- When in doubt, create an icon that resembles or simplifies the 24px shape.
- Use 1px lines for the icons.
- Utilize the IconGrid16 component to guide your design.
- Once the icon is complete, change the icon color to the “ColorScheme-Text” color.
- To export the icon, use the Icon Jetpack plugin.
I am excited to begin working on this project and would be grateful for any designer assistance that is available. If you need help learning Figma, please don’t hesitate to reach out, and we will gladly provide our support.
If you require edit access to the Figma file for the 16px icon collection, please let me know. I believe that collaboration and teamwork will lead to the creation of an amazing set of 16px icons.
Update 2

Additionally, I wanted to let you all know that I have made some edits to the color scheme of the graphic. I would appreciate any feedback you may have on the changes, and whether I might have missed anything. Let’s work together to create a cohesive and visually stunning icon collection!
Once our colors are aligned to what we need in the system, we will update the issue recently created for this purpose. This issue is under heavy development and things are changing rapidly. Read with caution and expect updates.
https://invent.kde.org/teams/vdg/issues/-/issues/82
Colors in the design system have changed, and I still need to match the colors to the color labels below, so it will look a little off for some time. The color variables in Figma are already updated to the latest feedback; only this graphic still needs those updates. For now, only the color boxes are correct, not the color names.
Update 3

This week we took the Ocean window shadows previously created by Manuel de la Fuente and integrated them into the design system. The shadow variables in Figma now use Ocean-inspired shadow levels to make them more visible. This is to address feedback that the shadows were previously too faint. Hopefully these make a difference.
For some reason I don't understand, the XL shadow looks fainter than the LG shadow, but it seems to be a visual bug in Figma: if you apply the XL shadow, it appears correctly in your graphics.

Update 4

The transition to PenPot is on hold for now until the community instance is created and the path manipulation updates are applied; without those, we couldn't recreate icons the same way we have them right now.
Wim Leers: XB week 21: web standards-powered bug fixes
The typical post-DrupalCon slow week — from travel woes, to taking some time off, to passing on learnings to the rest of the team. But there are still some interesting things to note this week :)
The funny bug fixed with one line of HTML

During DrupalCon, Chris “cosmicdreams” Weber spotted a funny edge case we somehow hadn’t triggered during development: after placing a component using Experience Builder (XB), using the component props form to specify values and pressing Enter actually submits the form, because that’s what forms do! But in this case, it is not meant to be submitted: the preview is updated live and saving will (eventually) happen automatically.
I investigated, and discovered the existence of <form method="dialog">, which turns out to be perfect for this use case! 1
Thanks to Chris in particular for not just reporting this, but also persevering in fixing this, including writing his very first end-to-end Cypress test! Along the way, half the team contributed either in-person at DrupalCon (Ben “bnjmnm” Mullins and Bálint “balintbrews” Kléri) or remotely (Jesse “jessebaker” Baker, Utkarsh “utkarsh_33”, Deepak “deepakkm” Mishra and I).
The bug fixed with CSS instead of JS and even net-deleting CSS

When placing components caused the height to change, the UI became jumpy — not good.
Gaurav “gauravvvv” started working on a fix, and in reviewing it, Jesse thought of a different approach altogether … the result: a +8,-53 diffstat that deleted a bunch of JS (one entire file even) and replaced it with a pure CSS solution (fit-content) — even one CSS line less than before!
Research mode: continued

I mentioned last week that a bunch of us were in research mode.
Ted “tedbow” Bowman and Travis “traviscarden” Carden continued research on #3475672: auto-saving, drafts, and all possible ways to achieve that — with Travis overhauling that issue summary in such a magnificently structured way that has rarely been seen on drupal.org. It really helps streamline the discussion!
Similarly, Harumi “hooroomoo” Jang, Xinran “xinran” Cao, and Alex “effulgentsia” Bronstein continued iterating on #3475363: in-UI (JS) component creation Proof-of-Concept using StackBlitz — stay tuned for a demo! :D
Missed a prior week? See all posts tagged Experience Builder.
Goal: make it possible to follow high-level progress by reading ~5 minutes/week. I hope this empowers more people to contribute when their unique skills can best be put to use!
For more detail, join the #experience-builder Slack channel. Check out the pinned items at the top!
Assorted clean-up & bugfixes

- Deepak lifted all methods out of SdcController2 into separate invokable services-as-controllers, which turned a single massive file into much more easily digestible pieces — and he was able to remove some former controller methods that had lost their usages.
- Shyam “shyam_bhatt” Bhatt, Omkar “omkar-pd” Deshpande and Jesse fixed the zoom buttons behaving erratically.
- … and a bunch more small bugs that are less interesting :)
Week 21 was September 30–October 6, 2024.
- I love that after all these years there’s still more to discover in the HTML spec, and more affordances are available that others have thoughtfully constructed that I’ve never before needed! ↩︎
- Originally named just fine, but definitely in need of a rename to prepare for #3454519: [META] Support component types other than SDC! ↩︎
Web Review, Week 2024-42
Let’s go for my web review for the week 2024-42.
You Can’t Make Friends With The Rockstars

Tags: tech, capitalism, marketing, politics, criticism
People have to realize that tycoons like the ones from big tech companies can both be rich and mediocre. They were smart enough to seize opportunities at the right time but they are not exceptional. In fact, they’re even boring and spineless.
The best quote in this paper I think is: “There is nothing special about Elon Musk, Sam Altman, or Mark Zuckerberg. Accepting that requires you to also accept that the world itself is not one that rewards the remarkable, or the brilliant, or the truly incredible, but those who are able to take advantage of opportunities, which in turn leads to the horrible truth that those who often have the most opportunities are some of the most boring and privileged people alive.”
The real problem is that lots of journalists can’t come to terms with giving up the fairy tale, and so fall prey to all their publicity stunts as if they had any hidden meaning. This is dangerous because of all the political power they try to seize for their own gain.
Meanwhile, “the most powerful companies enjoy a level of impunity, with their founders asked only the most superficial, softball of questions — and deflecting anything tougher by throwing out dead cats when the situation demands.”
Now you can go and read this long piece.
https://www.wheresyoured.at/rockstars/
Tags: tech, politics
It’s actually unsurprising: all those tech and crypto bros have assets in jeopardy if any regulation is applied to their industry. No wonder they’d support the candidate with the most libertarian agenda, after the current administration looked into antitrust cases and increased regulation (even if only marginally).
Tags: tech, blog, wordpress
Wow, the atmosphere looks fairly toxic at Automattic right now. It felt like it was just about the trademark dispute, but clearly the craziness runs much deeper. This is concerning for WordPress’s future, I think.
https://www.404media.co/automattic-buyout-offer-wordpress-matt-mullenweg/
Tags: tech, social-media, rss
Want to put an end to the social media platforms’ weight on our lives? For once there’s an individual solution which might work. This is a real chance because, as he rightfully points out, individual solutions are generally too complicated to bring systemic change. Here it is actually doable.
Tags: tech, cloudflare, rss
Cloudflare indeed needs to do better to accommodate RSS readers. They’re not malicious bots and shouldn’t be flagged as such.
https://openrss.org/blog/using-cloudflare-on-your-website-could-be-blocking-rss-users
Tags: tech, web, browser, google, privacy, politics
A good reminder that this is not the Google Chrome alternative you’re looking for. It’s the same privacy invading mindset with some bigotry on top.
https://www.spacebar.news/stop-using-brave-browser/
Tags: tech, web, archive, security
It’s a very important project, and it’s really concerning that this attack went through. The service is still partly disrupted, but they’re showing signs of recovery. Let’s wish them luck and good health. This archival service is essential for knowledge and history preservation on the web.
Tags: tech, ai, machine-learning, gpt, knowledge, criticism, research
Now the impact seems clear, and this is mostly bad news. This reduces the production of public knowledge, so everyone loses. Ironically, it also means less public knowledge available to train new models. At some point their only avenue to fine-tune their models will be user profiling, which will be private… I have a hard time seeing how we won’t end up stuck with another surveillance apparatus providing access to models running on outdated knowledge. This will lock so many behaviors and decisions in place.
https://academic.oup.com/pnasnexus/article/3/9/pgae400/7754871#483096365
Tags: tech, ai, machine-learning, neural-networks, gpt, logic, mathematics, research
Of course I recommend reading the actual research paper. This article is a good summary of the consequences though. LLMs definitely can’t be trusted with formal reasoning, including basic maths. This is a flaw in the way they are built; the path forward is likely merging symbolic and sub-symbolic approaches.
https://garymarcus.substack.com/p/llms-dont-do-formal-reasoning-and
Tags: tech, python, security, supply-chain, developer-experience
It’s tempting to use uv. It’s probably fine on a developer workstation at this point. It looks a bit early to use it in production though: it’s still young, and open questions remain regarding supply-chain security.
https://pythonspeed.com/articles/uv-python-production/
Tags: tech, raspberry-pi, gpu, gaming
Definitely a funny hack. Not usable for compute workloads though.
https://www.jeffgeerling.com/blog/2024/use-external-gpu-on-raspberry-pi-5-4k-gaming
Tags: tech, rust, embedded
Nice article showing the steps to port Rust code to run on deeply embedded systems. It highlights the difficulties and caveats of such a task.
https://towardsdatascience.com/nine-rules-for-running-rust-on-embedded-systems-b0c247ee877e
Tags: tech, programming, optimization
This is too often overlooked, but table lookups can help with performance if done well.
https://lemire.me/blog/2024/10/14/table-lookups-are-efficient/
Tags: tech, graphics, 2d, hardware
Nice graphic tricks when the hardware was harder to work with. It’s amazing how much we could fit back then out of sheer motivation.
https://arnaud-carre.github.io/2024-09-08-4ktribute/
Tags: tech, web, collaborative, crdt
Excellent introduction to sync engines and how they work. The concept indeed comes from the gaming industry, and we see it more in web applications nowadays due to user demand for offline support and real-time collaboration.
Tags: tech, programming, logic, ai, prolog
Finally a path forward for logic programming? An opportunity to evolve beyond Prolog and its variants? Good food for thought.
https://www-ps.informatik.uni-kiel.de/~mh/papers/WLP24.pdf
Tags: tech, reliability, tests, debugging
This is an important trait to have as a developer. If you’re content with things working without knowing why and how they work, you’re in for a world of pain later.
https://buttondown.com/hillelwayne/archive/be-suspicious-of-success/
Tags: tech, architecture, complexity, developer-experience, vendor-lockin
I tend to be on the “boring tech” side, but indeed this is a good reminder that what we want is to find the right balance.
https://yonkeltron.com/posts/boring-tech-is-stifling-improvement/
Tags: tech, complexity, cognition, design, architecture
Definitely this. Our cognitive capacity is limited, we’d better not deplete it due to complexity before we even reach the core of the problem at hand.
https://minds.md/zakirullin/cognitive#full
Tags: tech, philosophy
Our craft is based on shifting sands. This brings interesting philosophical questions, like why do it at all? I think the answer proposed in this short article is spot on. It can help bring new ideas on how to be in the world. This is more important than the code itself.
https://sean.voisen.org/blog/why-make-software
Tags: train, ecology, politics
This is definitely a good idea; I wish we had the same in France. It’s too bad that they plan to raise the price, as that’s going to limit the impact of the measure.
Bye for now!
Real Python: The Real Python Podcast – Episode #224: Narwhals: Expanding DataFrame Compatibility Between Libraries
How does a Python tool support all types of DataFrames and their various features? Could a lightweight library be used to add compatibility for newer formats like Polars or PyArrow? This week on the show, we speak with Marco Gorelli about his project, Narwhals.
Qt Creator 15 Beta released
We are happy to announce the release of Qt Creator 15 Beta!
Limit Application Memory Usage with systemd
I saw this question on KDE forum about how to limit memory usage of a specific application in KDE, using systemd specifically. I did some research on that.
Resource control in systemd

man systemd.resource-control lists plenty of options that we can set on a cgroup. E.g., to limit the memory usage of a service, we can add:

MemoryAccounting=yes
MemoryHigh=2G

under the [Service] section of its .service file.
The difference between this and ulimit is that ulimit is per process, while systemd resource control is per cgroup. I.e., MemoryHigh applies to the sum of the service process and all the sub-processes it spawns, even detached processes, i.e., daemons.
(That's actually the main point of cgroup: a process tree that a process can't escape via double-forking / daemonizing.)
Apps as systemd services

KDE Plasma launches apps as systemd services. (See this doc and this blog for more details.)
We can find the name of the systemd service of an app like this:
$ systemd-cgls | grep konsole
│ │ │ ├─app-org.kde.konsole@0d82cb37fcd64fe4a8b7cf925d86842f.service
│ │ │ │ ├─35275 /usr/bin/konsole
│ │ │ │ └─35471 grep --color=auto konsole

But the problem is:
- The part of the name after @ is a random string, changes every time the app is launched.
- The service is generated dynamically:
So if we want to limit the memory usage of Konsole, there's no persistent .service file on disk that we can edit.
Luckily, systemd allows us to create drop-in files to partially modify a service. Also, systemd considers app-org.kde.konsole@0d82cb37fcd64fe4a8b7cf925d86842f.service to be instances of a template named app-org.kde.konsole@.service. (This is how things like getty@tty3.service work.) So we can create a drop-in file named ~/.config/systemd/user/app-org.kde.konsole@.service.d/override.conf with the content:
[Service]
MemoryAccounting=yes
MemoryHigh=2G

and it will apply to all instances of app-org.kde.konsole@.service, even if there's no service file with that name.
(The file doesn't have to be named "override.conf". Any name with .conf works.)
Then we need to reload the systemd user manager: systemctl --user daemon-reload.
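The instance-to-template mapping that makes this work can be sketched in a few lines (a hypothetical helper for illustration only, not actual systemd code): systemd derives the template name by stripping everything between the '@' and the final '.':

```python
def template_of(unit: str) -> str:
    """Map a systemd unit instance name to its template name,
    e.g. 'getty@tty3.service' -> 'getty@.service'."""
    name, _, suffix = unit.rpartition(".")
    prefix, at, _instance = name.partition("@")
    # Units without an '@' are not template instances; leave them unchanged.
    return f"{prefix}@.{suffix}" if at else unit

print(template_of("app-org.kde.konsole@0d82cb37fcd64fe4a8b7cf925d86842f.service"))
# app-org.kde.konsole@.service
```

Any drop-in placed under app-org.kde.konsole@.service.d/ therefore applies to every instance, no matter which random string follows the '@'.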
Now we can launch Konsole, and check if the memory limit works:
$ systemctl --user show 'app-org.kde.konsole@*.service' | grep MemoryHigh=
EffectiveMemoryHigh=2147483648
MemoryHigh=2147483648
StartupMemoryHigh=infinity

Note: as explained above, the limit applies to the sum of Konsole and all processes it spawns. E.g., if we run kwrite in Konsole, the memory usage of kwrite will be accounted against Konsole's limit, and any limit we set for KWrite itself won't apply.
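The 2147483648 shown by systemctl is simply 2G expressed in bytes, since systemd's size suffixes for these options use binary multipliers. A small sketch of the conversion (hypothetical helper, for illustration):

```python
def to_bytes(size: str) -> int:
    """Convert a systemd-style size such as '2G' to bytes (binary suffixes)."""
    multipliers = {"K": 1024, "M": 1024**2, "G": 1024**3, "T": 1024**4}
    if size and size[-1] in multipliers:
        return int(size[:-1]) * multipliers[size[-1]]
    return int(size)

print(to_bytes("2G"))  # 2147483648, matching the MemoryHigh value above
```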
Set defaults for all apps

We can put defaults in ~/.config/systemd/user/app-.service.d/override.conf, and it will match all services whose name starts with app-.
Alternatively, if we run systemd-cgls, we can see that all apps are under a node named app.slice. So we can also put defaults in ~/.config/systemd/user/app.slice.d/override.conf, and all apps will inherit the settings. However, this is different from the previous method, as user services are also under app.slice by default, so they will also inherit the settings.
Drupal Starshot blog: Presenting the Drupal CMS v1 content strategy
Since Drupal CMS aims to enable marketers to build and launch websites quickly, Drupal CMS must offer out-of-the-box content types that are common across our target markets.
This underlying content structure is the content model. Leading up to DrupalCon Barcelona, the Starshot leadership team highlighted the need to create an initial content strategy for Drupal CMS. Through collaboration with the Drupal CMS UX steering committee, we have developed a content model for the initial release.
These content types will be presented during site setup, and will also be available to add later on. As with much of Drupal CMS, the content types are powered by ‘recipes’, which will provide some additional functionality beyond the content types themselves, such as listing pages, menu links and even default content, where appropriate.
Content types for v1

All Drupal CMS sites will include a Basic page content type by default, to ensure users are able to create content even if they don't select any of the optional types. In addition to Basic page, the following content types will be available in the initial release:
- Event
- News article
- Blog
- Project
- Case study
- Person profile
For more detailed information about the content types and the process we followed to develop them, please see the full content strategy page.
Cristina Chumillas, the Drupal CMS UX lead, and I would like to thank everyone who contributed to developing this strategy and the accompanying documentation: Megh Plunkett, Emma Horrell, Niklas Missel, Lewis Nyman, Laurens Van Damme and Kat Shaw.
Incorporating SEO into the strategy

Since we're targeting marketers, we are making search engine optimization a top priority. Drupal CMS will provide an optional recipe for 'SEO tools'. By default, sites will have basic SEO functionality in place, such as default path alias patterns, hreflang metadata, and the ability to manage redirects.
The SEO tools recipe provides additional functionality such as meta tags, XML sitemap, and real-time SEO analysis of content. For meta tag optimization, this recipe also adds three new fields to each content type: SEO title, SEO description and SEO image. Currently these fields are in a collapsed fieldset on the node form, and are optional. If content is not provided, the meta tags will fall back to use the node title, description and featured image.
Drupal CMS content beyond v1

Once the initial content types are available, we will work on expanding the content model in a second phase for later releases of Drupal CMS. Adding to the content model iteratively allows us to gather user feedback, make improvements, and adjust accordingly. We have some ideas for which additional content types to include later, and will have a better understanding of which content types are most desired once people begin building Drupal CMS sites.
If you have feedback on this proposed content strategy, or questions about it, please add it to this issue. We are planning to test this with users from the Drupal CMS target personas, and will provide more information when the testing is confirmed.
Static builds of KDE Frameworks and KDE applications
Being able to build our libraries and applications statically has been on the wishlist for a long time, and recently we made some progress in that direction again. Similar to the recent Android integration improvements, this is also a direct result of Akademy.
Reviving the CI

We had CI for static builds during the 5 era, but we lost quite a bit of coverage there during the transition to 6. However, what we had were only static builds of KDE libraries against a shared build of Qt. That finds some, but by far not all, issues related to static builds.
The new setup now actually has a static Qt6 to build against, which forces us to also sort out plugin integration correctly. While it is a bit more work to get back to the desired level of CI coverage that way, we will get to a much better result.
Static initialization and CMake

A common problem in static builds is initialization code that is run on shared library loading, as that simply doesn’t exist in static builds, and unless you know what you are doing it will usually be silently dropped from the final executable.
This has previously resulted in various workarounds, such as explicit initialization API (e.g. Q_INIT_RESOURCE), which is easy to miss. With Qt 6 having switched to CMake there’s a new option now though, so-called object libraries. Those are used as implementation details to inject the initialization code from static libraries into the final executable.
From a consumer point of view you get the same behavior as with shared libraries: you link against it and initialization just works. Behind the scenes this adds quite a bit of complexity to the build system though, which then sometimes leaks through into our build system code as well.
Probably the most common case is qt6_add_resources. Qt resources need initialization code, so the target they are added to gets another dependency attached to it: an object library containing the initialization.
If that target is a library the additional object libraries need to be installed and exported via CMake as well, to make those available to consumer code (ECM support for this).
add_library(MyLib)
qt6_add_resources(MyLib OUTPUT_TARGETS _out_targets ...)
install(TARGETS MyLib ${_out_targets} EXPORT MyLibTargets ...)

The OUTPUT_TARGETS part is the new thing to add here (example). Other such cases are custom shaders (via qt6_add_shaders) and of course QML modules (more on those below).
Installing QML modules

QML modules created with ecm_add_qml_module also produce object library targets, but since that macro works on a higher level we can handle more of that automatically (ECM MR).
ecm_add_qml_module(mymodule
    URI "org.kde.mymodule"
    ...
    INSTALLED_PLUGIN_TARGET KF6::mymodule)
...
ecm_finalize_qml_module(mymodule EXPORT KF6MyModuleTargets)

The EXPORT argument to ecm_finalize_qml_module is the new part here. This takes care both of installing the object libraries and of exporting the QML module itself in the installed CMake configuration file. The latter is needed for statically linking QML modules into the application (see below).
When using ALIAS targets (common in KDE Frameworks, much less common outside of that) we also might need the INSTALLED_PLUGIN_TARGET argument for ecm_add_qml_module(), to make sure the module target name matches the installed alias. That’s not new, but has little practical impact outside of static linking so we have been a bit sloppy there (simple example, complex example).
Importing QML modules

Linking QML modules into the final application binary got a lot easier with Qt 6, thanks to qmlimportscanner.
add_executable(myapp)
ecm_add_qml_module(myapp ...)
...
if (NOT QT6_IS_SHARED_LIBS_BUILD)
    qt6_import_qml_plugins(myapp)
endif()

That’s it: no more Q_IMPORT_PLUGIN macros or manually listing modules to link. There’s less tolerance for mistakes in the CMake and QML module metadata now though; anything incomplete or missing there will give you linker failures or module loading errors at runtime (see also ECM support for additional import search paths).
Outlook

Since Akademy, almost 50 merge requests related to this have been submitted, most of which have already been integrated. With all that, it's now possible to build and run Alligator against a static Qt. Alligator was the first milestone we picked for this during Akademy, due to being sufficiently complex to prove the viability of all this while not having dependencies that could add significant additional challenges of their own (such as the multimedia stack).
However, this work has been done mostly on desktop Linux, and while that allowed for rapid progress, it's the platform this is least interesting for. We have yet to reproduce this on Android, where it should probably bring the most immediate benefit, and of course it removes a big obstacle for potential iOS support (as Qt can only be linked statically there).
Programiz: Python match…case Statement
Matt Layman: Epic Debugging, Hilarious Outcome - Building SaaS #205
Reproducible Builds (diffoscope): diffoscope 281 released
The diffoscope maintainers are pleased to announce the release of diffoscope version 281. This version includes the following changes:
[ Chris Lamb ]
* Don't try and test with systemd-ukify within Debian stable.

[ Jelle van der Waa ]
* Add support for UKI files.

You can find out more by visiting the project homepage.