Feeds

Dries Buytaert: Drupal 11 released

Planet Drupal - Tue, 2024-11-26 15:29

Today is a big day for Drupal as we officially released Drupal 11!

In recent years, we've seen an uptick in innovation in Drupal. Drupal 11 continues this trend with many new and exciting features. You can see an overview of these improvements in the video below:

Drupal 11 has been out for less than six hours, and updating my personal site was my first order of business this morning. I couldn't wait! Dri.es is now running on Drupal 11.

I'm particularly excited about two key features in this release, which I believe are transformative and will likely reshape Drupal in the years ahead:

  1. Recipes (experimental): This feature allows you to add new features to your website by applying a set of predefined configurations.
  2. Single-Directory Components: SDCs simplify front-end development by providing a component-based workflow where all necessary code for each component lives in a single, self-contained directory.
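
As a sketch, a single-directory component bundles everything under one folder (file names illustrative):

```
components/
└── my-card/
    ├── my-card.component.yml   # component metadata and props schema
    ├── my-card.twig            # template
    ├── my-card.css             # optional styling
    └── my-card.js              # optional behavior
```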

These two new features represent a major shift in how developers and site builders will work with Drupal, setting the stage for even greater improvements in future releases. For example, we'll rely heavily on them in Drupal Starshot.

Drupal 11 is the result of contributions from 1,858 individuals across 590 organizations. These numbers show how strong and healthy Drupal is. Community involvement remains one of Drupal's greatest strengths. Thank you to everyone who contributed to Drupal 11!

Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #657 (Nov. 26, 2024)

Planet Python - Tue, 2024-11-26 14:30

#657 – NOVEMBER 26, 2024
View in Browser »

NumPy Practical Examples: Useful Techniques

In this tutorial, you’ll learn how to use NumPy by exploring several interesting examples. You’ll read data from a file into an array and analyze structured arrays to perform a reconciliation. You’ll also learn how to quickly chart an analysis and turn a custom function into a vectorized function.
REAL PYTHON

Loop Targets

Python's for loop can use any assignment target, including a dict item. This post covers what that means and shows that it is no more costly than regular assignment.
NED BATCHELDER
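
A minimal sketch of the idea from the post: the for statement accepts any assignment target, not just a plain name.

```python
# Python's for statement accepts any assignment target -- including a
# dict item, not just a plain variable name.
counts = {}

# Each iteration assigns the current value directly into the dict entry,
# exactly as "counts['latest'] = value" would.
for counts["latest"] in range(3):
    pass

print(counts["latest"])  # 2  (the last value assigned by the loop)
```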

Introducing the Windsurf Editor. Tomorrow’s IDE, Today

Introducing the Windsurf Editor, the first agentic IDE. All the features you know and love from Codeium’s extensions plus new capabilities such as Cascade that act as collaborative AI agents, combining the best of copilot and agent systems →
CODEIUM sponsor

Vector Animations With Python

This post shows you how to use gizeh and moviepy to create animations using vector graphics.
DEEPNOTE.COM

Take the 2024 Django Developers Survey

DJANGO SOFTWARE FOUNDATION

Python 3.14.0 Alpha 2 Released

CPYTHON DEV BLOG

Quiz: Python Dictionary Comprehensions

REAL PYTHON

PyTexas (Austin) Call for Proposals

PYTEXAS.ORG

Quiz: Namespaces and Scope in Python

REAL PYTHON

Articles & Tutorials

Avoid Counting in Django Pagination

Django’s Paginator class separates data into chunks to display pages of results. By default, it issues a SQL COUNT(*) query under the hood, which can be expensive for large amounts of data. This article shows you how to get around that.
NIK TOMAZIC
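
One common workaround is to override Paginator.count so no COUNT(*) is issued. A minimal sketch of that pattern: Django's real class lives in django.core.paginator; the stand-in below only mimics its count property so the example is self-contained.

```python
class Paginator:
    """Stand-in for django.core.paginator.Paginator, kept minimal so the
    example runs without Django installed."""

    def __init__(self, object_list, per_page):
        self.object_list = object_list
        self.per_page = per_page

    @property
    def count(self):
        # Django evaluates this lazily, issuing a SQL COUNT(*) for querysets.
        return len(self.object_list)


class FixedCountPaginator(Paginator):
    """Trusts a caller-supplied (e.g. estimated or cached) total instead of
    counting, which avoids the expensive COUNT(*) on large tables."""

    def __init__(self, object_list, per_page, fixed_count):
        super().__init__(object_list, per_page)
        self._fixed_count = fixed_count

    @property
    def count(self):
        return self._fixed_count


pages = FixedCountPaginator(range(100), per_page=10, fixed_count=5000)
print(pages.count)  # 5000 -- no counting of the underlying data happened
```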

Working With TOML and Python

TOML is a configuration file format that’s becoming increasingly popular in the Python community. In this video course, you’ll learn the syntax of TOML and explore how you can work with TOML files in your own projects.
REAL PYTHON course

Categories of Leadership on Technical Teams

Understanding the different types of technical leadership can help you better structure teams and define roles and responsibilities. This post talks about the various types of technical leaders you might encounter.
BEN KUHN

Quick Prototyping With Sqlite3

Python’s sqlite3 bindings make it a great tool for quick prototyping while you’re working out exactly what you’re building and what kind of database schema makes sense.
JUHA-MATTI SANTALA
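
The whole prototyping loop can fit in a few lines with an in-memory database:

```python
import sqlite3

# An in-memory database: nothing to set up, nothing to clean up.
con = sqlite3.connect(":memory:")

# executescript runs several statements at once -- handy for sketching
# out a schema while you figure out what you're building.
con.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO users (name) VALUES ('ada'), ('grace');
""")

rows = con.execute("SELECT name FROM users ORDER BY name").fetchall()
print(rows)  # [('ada',), ('grace',)]
```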

Django Performance and Optimization

“This document provides an overview of techniques and tools that can help get your Django code running more efficiently - faster, and using fewer system resources.”
DJANGO SOFTWARE FOUNDATION

Python Dependency Management Is a Dumpster Fire

Managing dependencies in Python can be a real challenge. This deep-dive article walks through the problems and shows how they can be mitigated, if not solved.
NIELS CAUTAERTS

Under the Microscope: Ecco the Dolphin

This article shows the use of Ghidra and Python to reverse engineer the encoding scheme of the Sega Dreamcast game Ecco the Dolphin.
32BITS

Is Python Really That Slow?

The speed of Python is an age-old conversation. This post talks about just what that claim does and doesn’t mean.
MIGUEL GRINBERG

Threads Beat Async/Await

Further musings about async & await and why Armin thinks virtual threads are a better model.
ARMIN RONACHER

Python for R Users

Resources for getting better at Python, aimed at experienced R developers.
STEPHEN TURNER

Projects & Code

nbformat: Base Implementation of Jupyter Notebook Format

PYPI.ORG

pex: Generate Python EXecutable Files, Lock Files and Venvs

GITHUB.COM/PEX-TOOL

Async, Pure-Python Server-Side Rendering Engine

VOLFPETER.GITHUB.IO • Shared by Peter Volf

django-tasks: Background Workers Reference Implementation

GITHUB.COM/REALORANGEONE

dsRAG: Retrieval Engine for Unstructured Data

GITHUB.COM/D-STAR-AI

Events

Weekly Real Python Office Hours Q&A (Virtual)

November 27, 2024
REALPYTHON.COM

Python New Zealand: Mapping Hurricane Wind Speeds From Space

November 28, 2024
MEETUP.COM

SPb Python Drinkup

November 28, 2024
MEETUP.COM

PyCon Wroclaw 2024

November 30 to December 1, 2024
PYCONWROCLAW.COM

PyDay Togo 2024

November 30 to December 1, 2024
LU.MA

PyDelhi User Group Meetup

November 30, 2024
MEETUP.COM

Happy Pythoning!
This was PyCoder’s Weekly Issue #657.

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

MidCamp - Midwest Drupal Camp: Session Submission Now Open for MidCamp 2025!

Planet Drupal - Tue, 2024-11-26 11:25
Session Submission Now Open for MidCamp 2025!

As the season of gratitude approaches, we’re excited to celebrate you—our future speakers! If you've got an idea for a session, now's the time to get involved in MidCamp 2025, happening May 20-22 in Chicago.

Call for Speakers

Since 2014, MidCamp has hosted over 300 amazing sessions, and we’re ready to add your talk to that legacy. We’re seeking presentations for all skill levels—from Drupal beginners to advanced users, as well as end users and business professionals. Check out our session tracks for full details on the types of talks we’re looking for.

Not quite ready? No worries! Join us for one of our Speaker Workshops:

  • December 2024 (TBD): Crafting an Outstanding Proposal
  • March 2025 (TBD): Polishing Your Presentation (Open to both MidCamp and DrupalCon Atlanta 2025 presenters!)
Key Dates
  • Session Proposals Open: November 25, 2024

Submit your session now!

  • Proposal Deadline: January 12, 2025
  • Speakers Notified: Week of February 17, 2025
  • MidCamp Sessions: May 20-21, 2025
P.S.

We hear Florida is also calling for submissions—but let’s be real, we know where your heart lies. 😊

Sponsor MidCamp

Looking to connect with the Drupal community? Sponsoring MidCamp is the way to do it! With packages starting at $600, there are opportunities to suit any budget. Whether you’re recruiting talent, growing your brand, or simply supporting the Drupal ecosystem, MidCamp sponsorship offers great value.

Act early to maximize your exposure!

Learn more about sponsorship opportunities

Stay in the Loop

Don’t miss a beat!

Keep an eye on our news page and subscribe to our newsletter for updates on the venue, travel options, social events, and speaker announcements.

Ready to submit your session? Let’s make MidCamp 2025 unforgettable!

Categories: FLOSS Project Planets

Web Search Keywords

Planet KDE - Tue, 2024-11-26 10:32

Did you know about a small but very useful feature from KDE?

Open KRunner via Alt+Space and type qw:KDE to search Qwant for KDE:

Pressing Enter will open up your browser with the specified KDE search on Qwant!

There are a lot of other Web Search Keywords like:

Wikipedia:

Invent:

Translate from English to German on dict.cc

You can find all of them by opening Web Search Keywords from KRunner:

Extra: Create your own!

I often use a Fedora tool called COPR, so let’s use it as an example and create our own web search keyword.

Do your search in the webpage:

And now the important part you need:

Now back to the Web Search Keyword settings:

Fill in the data needed taking care of the placeholder for our input!:
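
For COPR, the entry needs the \{@} placeholder where the search term goes; a sketch of what this can look like (the exact COPR search URL is an assumption here):

```
Shortcuts:    copr
Shortcut URL: https://copr.fedorainfracloud.org/coprs/fulltext/?fulltext=\{@}
```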

Now we have our own Web Search Keyword:

NOTE: for some reason I had to click Apply and OK until all the different settings windows were closed before the new custom Web Search Keyword worked.

And the result:

I hope it’s useful to somebody!

Categories: FLOSS Project Planets

Matt Glaman: Restrict Composer dependency updates to only patch releases

Planet Drupal - Tue, 2024-11-26 09:00

I was doing website maintenance and checked for outdated dependencies with composer outdated. I usually filter with -D to check only direct dependencies and -p to show packages with patch releases available. These are typically easy pickings. I saw I was on 2.1.3 of the Honeypot module and 2.1.4 was available. So I ran composer update drupal/honeypot. I noticed the module was updated to 2.2.0 instead, because my Composer constraint drupal/honeypot: ^2.0 allows minor updates. I figured that was fine enough. Turns out it wasn't.
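
The restriction the title describes can be sketched in composer.json with a tilde constraint (the package name is from the post; the rest is illustrative):

```json
{
    "require": {
        "drupal/honeypot": "~2.1.3"
    }
}
```

~2.1.3 means >=2.1.3 <2.2.0, so composer update can pick up 2.1.4 but will never jump to 2.2.0.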

Categories: FLOSS Project Planets

Real Python: Managing Dependencies With Python Poetry

Planet Python - Tue, 2024-11-26 09:00

When your Python project relies on external packages, you need to make sure you’re using the right version of each package. After an update, a package might not work as it did before. A dependency manager like Python Poetry helps you specify, install, and resolve external packages in your projects. This way, you can be sure that you always work with the correct dependency version on every machine.

In this video course, you’ll learn how to:

  • Create a new project using Poetry
  • Add Poetry to an existing project
  • Configure your project through pyproject.toml
  • Pin your project’s dependency versions
  • Install dependencies from a poetry.lock file
  • Run basic Poetry commands using the Poetry CLI
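
The pinning described above lives in pyproject.toml under Poetry's own table; a minimal sketch (names and versions illustrative):

```toml
[tool.poetry]
name = "demo-project"
version = "0.1.0"
description = ""

[tool.poetry.dependencies]
python = "^3.11"
requests = "^2.31"  # caret constraint: >=2.31.0 <3.0.0
```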

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Highlights from the Digital Public Goods Alliance Annual Members Meeting 2024

Open Source Initiative - Tue, 2024-11-26 06:45

This month, I had the privilege of representing the Open Source Initiative at the Digital Public Goods Alliance (DPGA) Annual Members Meeting. Held in Singapore, this event marked our second year participating as members, following our first participation in Ethiopia. It was an inspiring gathering of innovators, developers and advocates working to create a thriving ecosystem for Digital Public Goods (DPGs) as part of UNICEF’s initiative to advance the United Nations’ sustainable development goals (SDGs).

Keynote insights

The conference began with an inspiring keynote by Liv Marte Nordhaug, CEO of the DPGA, who made a call for governments and organizations to incorporate:

  1. Open Source first principles
  2. Open data at scale
  3. Interoperable Digital Public Infrastructure (DPI)
  4. DPGs as catalysts for climate change action

These priorities underscored the critical role of DPGs in fostering transparency, accountability and innovation across sectors.

DPGs and Open Source AI

A standout feature of the first day of the event was the Open Source AI track led by Amreen Taneja, DPGA Standards Lead, which encompassed three dynamic sessions:

  1. Toward AI Democratization with Digital Public Goods: This session explored the role of DPGs in contributing to the democratization of AI technologies, including AI use, development and governance, to ensure that everyone benefits from AI technology equally.
  2. Fully Open Public Interest AI with Open Data: This session highlighted the need for open AI infrastructure supported by accessible, high-quality datasets, especially in global majority countries. Discussions turned to how open training data sets ought to be licensed to ensure open, public-interest AI.
  3. Creating Public Value with Public AI: This session examined real-world applications of generative AI in public services. Governments and NGOs showcased how AI-enabled tools can effectively tackle social challenges, leveraging Open Source solutions within the AI stack.
DPG Product Fair: showcasing innovation

The second day of the event was marked by the DPG Product Fair, which provided a science-fair-style platform for showcasing DPGs. Notable examples included:

  • India’s eGov DIGIT Open Source platform, serving over 1 billion citizens with robust digital infrastructure.
  • Singapore’s Open Government Products, which leverages Open Source and enables open collaboration, allowing this small nation to expand its impact together with other Southeast Asian nations.

One particularly engaging session was Sarah Espaldon’s “Improving Government Services with Legos.” This presentation from the Singapore government highlighted the benefits of modular DPGs in enhancing service delivery and building flexible DPI capabilities.

Privacy best practices for DPGs

A highlight from the third and final day of the event was the privacy-focused workshop co-hosted by the DPGA and the Open Knowledge Foundation. As privacy becomes a central concern for DPGs, the DPGA introduced its Standard Expert Group to refine privacy requirements and develop best practice guidelines. The interactive session provided invaluable feedback, driving forward the development of robust privacy standards for DPGs.

Looking ahead

The event reaffirmed the potential of Open Source technologies to transform global public goods. As we move forward, the Open Source Initiative is committed to advancing these conversations, including around the Open Source AI Definition and fostering a more inclusive digital ecosystem. A special thanks to the dedicated DPGA team—Liv Marte Nordhaug, Lucy Harris, Max Kintisch, Ricardo Miron, Luciana Amighini, Bolaji Ayodeji, Lea Gimpel, Pelin Hizal Smines, Jon Lloyd, Carol Matos, Amreen Taneja, and Jameson Voisin—whose efforts made this conference a success. We look forward to another year of impactful collaboration and to the Annual Members Meeting next year in Brazil!

Categories: FLOSS Research

Emanuele Rocca: Building Debian packages The Right Way

Planet Debian - Tue, 2024-11-26 03:34

There is more than one way to do it, but it seems that The Right Way to build Debian packages today is using sbuild with the unshare backend. The most common backend before the rise of unshare was schroot.

The official Debian Build Daemons have recently transitioned to using sbuild with unshare, providing a strong motivation to consider making the switch. Additionally the new approach means: (1) no need to configure schroot, and (2) no need to run the build as root.

Here are my notes about moving to the new setup, for future reference and in case they may be useful to others.

First I installed the required packages:

apt install sbuild mmdebstrap uidmap

Then I created the following script to update my chroots every night:

#!/bin/bash
for arch in arm64 armhf armel; do
    HOME=/tmp mmdebstrap --quiet --arch=$arch --include=ca-certificates --variant=buildd unstable \
        ~/.cache/sbuild/unstable-$arch.tar http://127.0.0.1:3142/debian
done

In the script, I’m calling mmdebstrap with --quiet because I don’t want any output on successful execution. The script runs in cron with email notifications, and I only want to get a message if something goes south. I’m setting HOME=/tmp for a similar reason: the unshare user does not have access to my actual home directory, and by default dpkg tries to use $HOME/.dpkg.cfg as the configuration file. By overriding HOME I avoid the following message on standard error:

dpkg: warning: failed to open configuration file '/home/ema/.dpkg.cfg' for reading: Permission denied

Then I added the following to my sbuild configuration file (~/.sbuildrc):

$chroot_mode = 'unshare';
$unshare_tmpdir_template = '/dev/shm/tmp.sbuild.XXXXXXXXXX';

The first option sets the sbuild backend to unshare, whereas unshare_tmpdir_template is needed on Bookworm to ensure that the build process runs in memory rather than on disk for performance reasons. Starting with Trixie, /tmp is by default a tmpfs so the setting won’t be needed anymore.

Packages for different architectures can now be built as follows:

# Tarball used: ~/.cache/sbuild/unstable-arm64.tar
$ sbuild --dist=unstable hello

# Tarball used: ~/.cache/sbuild/unstable-armhf.tar
$ sbuild --dist=unstable --arch=armhf hello

If you have any comments or suggestions about any of this, please let me know.

Categories: FLOSS Project Planets

Specbee: The All-New Drupal CMS: Find Answers to Your 13 Most Common Questions

Planet Drupal - Tue, 2024-11-26 01:25
We know you love FAQs! We'll answer the questions you have about the all-new Drupal CMS (yep, 13 of them). From features to migration, find out everything you need to know about this groundbreaking new CMS.
Categories: FLOSS Project Planets

Armin Ronacher: Constraints are Good: Python's Metadata Dilemma

Planet Python - Mon, 2024-11-25 19:00

There is currently an effort underway to build a new universal lockfile standard for Python, most of which is taking place on the Python discussion forum. This initiative has highlighted the difficulty of creating a standard that satisfies everyone. It has become clear that different Python packaging tools have slightly different ideas of what a lockfile is supposed to look like or even be used for.

In those discussions, however, another small aspect re-emerged: Python has a metadata problem. Python's metadata system is too complex and suffers from what I would call a “lack of constraints”.

JavaScript: Example of Useful Constraints

JavaScript provides an excellent example of how constraints can simplify and improve a system. In JavaScript, metadata is straightforward. Whether you develop against a package locally or use a package from npm, the metadata looks the same. There is a single package.json file that contains the most important metadata of a package, such as name, version or dependencies. This simplicity imposes significant but beneficial constraints:

  • There is a 1:1 relationship between an npm package and its metadata. Every npm package has a single package.json file that is the source of truth of metadata. Metadata is trivially accessible, even programmatically, via require('packageName/package.json').
  • Dependencies (and all other metadata) are consistent across platforms and architectures. Platform-specific binaries are handled via a filter mechanism (os and cpu) paired with optionalDependencies. [1]
  • All metadata is static, and updates require explicit changes to package.json prior to distribution or installation. Tools are provided to manipulate that metadata such as npm version patch which will edit the file in-place.
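
As a sketch, a platform-specific binary package combines these fields in its package.json like this (names illustrative):

```json
{
    "name": "example-cli-linux-arm64",
    "version": "1.0.0",
    "os": ["linux"],
    "cpu": ["arm64"]
}
```

The parent package then lists example-cli-linux-arm64 under its optionalDependencies; npm attempts to install all optional dependencies but skips any whose os/cpu filter does not match the current platform.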

These constraints offer several benefits:

  • Uniform behavior regardless of whether a dependency is installed locally or from a remote source. There is not even a filesystem layout difference between what comes from git or npm. This enables things like replacing an installed dependency with a local development copy, without change in functionality.
  • There is one singular source of truth for all metadata. You can edit package.json and any consumer of that metadata can just monitor that file for changes. No complex external information needs to be consulted.
  • Resolvers can rely on a single API call to fetch dependency metadata for a version, improving efficiency. Practically this also means that the resolver only needs to hit a single URL to retrieve all possible dependencies of a dependency. [2]
  • It makes auditing much easier because there are fewer moving parts and just one canonical location for metadata.
Python: The Cost of Too Few Constraints

In contrast, Python has historically placed very few constraints on metadata. For example, the old setup.py based build system essentially allowed arbitrary code execution during the build process. At one point it was at least strongly suggested that the version produced by that build step better match what is uploaded to PyPI. However, in practice, if you lie about the version that is okay too. You could upload a source distribution to PyPI that claims it's 2.0 but will in fact install 2.0+somethinghere or a completely different version entirely.

What happens is that both before a package is published to PyPI and when a package is installed locally after downloading, the metadata is generated from scratch. Not only does that mean the metadata does not have to match, it also means that it's allowed to be completely different. It's absolutely okay for a package to claim it depends on cool-dependency on your machine, but on uncool-dependency on mine. Or to depend on different packages depending on the time of day or the phase of the moon.

Editable installs and caching are particularly problematic since metadata could become invalid almost immediately after being written. [3]

Some of this has been somewhat improved because the new pyproject.toml standard encourages static metadata. However build systems are entirely allowed to override that by falling back to what is called “dynamic metadata” and this is something that is commonly done.
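
The difference is visible in pyproject.toml itself; a sketch (values illustrative):

```toml
[project]
name = "demo"
version = "1.0.0"            # static: readable without running the build backend
dependencies = ["requests"]

# A package using dynamic metadata would instead declare, e.g.:
# dynamic = ["version"]
# and defer to the build backend to compute the value.
```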

In practice this system incurs a tremendous tax to everybody that can be easily missed.

  • Disjointed and complex metadata access: there is no clear relationship between a PyPI package name and the installed Python modules. If you know what the PyPI package name is, you can access metadata via importlib.metadata. Metadata is not read from pyproject.toml, even if it's static; instead the package name is used to look up the metadata in the .dist-info folder (specifically the METADATA file therein) installed into site-packages.

  • Mandatory metadata re-generation: As a consequence, if you edit pyproject.toml to change a piece of metadata, you need to re-install the package for that metadata to be updated in the .dist-info. People commonly forget to do that, so desynchronized metadata is very common. This is true even for static metadata today!

  • Unclear cache invalidation: Because metadata can be dynamic, it's not clear when you should automatically re-install a package. It's not enough to just track pyproject.toml for changes when dynamic metadata is used. uv for instance has a really complex, explicit cache management system so one can help uv detect outdated metadata. This obviously is non-standardized, requires uv to understand version control systems and is also not shared with other tools. For instance if you know that the version information incorporates the git hash, you can tell uv to pay attention to git commits.

  • Fragmented metadata storage: even where generated metadata is stored is complex. Different systems have slightly different behavior for storing that metadata.

    • When working locally (eg: editable installs) what happens depends on the build system:
      • If setuptools is used, metadata is written into two locations: the legacy <PACKAGE_NAME>.egg-info/PKG-INFO file, and additionally the new location for metadata inside site-packages, in a <PACKAGE_NAME>.dist-info/METADATA file.
      • If hatch and most other modern build systems are used, metadata is only written into site-packages. (into <PACKAGE_NAME>.dist-info/METADATA)
      • If no build system is configured it depends a bit on the installer. pip will even for an editable install build a wheel with setuptools, uv will only build a wheel and make the metadata available if one runs uv build. Otherwise the metadata is not available (in theory it could be found in pyproject.toml for as long as it's not dynamic).
    • For source distributions (sdist) first the build step happens as in the section before. Afterwards the metadata is thrown into a PKG-INFO file. It's currently placed in two locations in the sdist: PKG-INFO in the root and <PACKAGE_NAME>.egg-info/PKG-INFO. That metadata however I believe is only used for PyPI, when installing the sdist locally the metadata is regenerated from pyproject.toml (or if setuptools is used setup.py). That's also why metadata can change from what's in the sdist to what's there after installation.
    • For wheels the metadata is placed in <PACKAGE_NAME>.dist-info/METADATA exclusively. Wheels have static metadata, so no build step is taking place. What is in the wheel is always used.
  • Dynamic metadata makes resolvers slow: Dynamic metadata makes the job of resolvers and installers very hard and slows them down. Today for instance advanced resolvers like poetry or uv sometimes are not able to install the right packages, because they assume that dependency metadata is consistent across sdists and wheels. However there are a lot of sdists available on PyPI that publish incomplete dependency metadata (just whatever the build step for the sdist created on the developer's machine is what is cached on PyPI).

    Not getting this right can be the difference of hitting one static URL with all the metadata, and downloading a zip file, creating a virtualenv, installing build dependencies, generating an entire sdist and then reading the final generated metadata. Many orders of magnitude difference in time it takes to execute.

    This also extends to caching. If the metadata can constantly change, how would a resolver cache it? Is it required to build all possible source distributions to determine the metadata as part of resolving?

  • Cognitive complexity: The system introduces an enormous cognitive overhead which makes it very hard for users to understand, particularly when things go wrong. Incorrectly cached metadata can be almost impossible to debug for a user because they do not understand what is going on. Their pyproject.toml shows the right information, yet for some reason it behaves incorrectly. Most people don't know what "egg info" or "dist info" is. Or why an sdist has metadata in a different location than a wheel or a local checkout.

    Having support for dynamic metadata also means that developers continue to maintain elaborate and confusing systems. For instance there is a plugin for hatch that dynamically creates a readme [4], requiring even arbitrary Python code to run to display documentation. There are plugins to automatically change versions to incorporate git version hashes. As a result, to figure out what version you actually have installed, it's not enough to just look into a single file; you might have to rely on a tool to tell you what's going on.
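
The programmatic access path described above is importlib.metadata, which reads the installed .dist-info rather than pyproject.toml; a small sketch:

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(dist_name):
    """Return the installed version of a distribution, or None if it is not
    installed. Note that the argument is the *distribution* name (as on
    PyPI), which need not match the importable module name."""
    try:
        # Reads <PACKAGE_NAME>.dist-info/METADATA from site-packages,
        # not pyproject.toml -- so it can be stale after edits.
        return version(dist_name)
    except PackageNotFoundError:
        return None


print(installed_version("surely-not-an-installed-dist"))  # None
```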

Moving The Cheese

The challenge with dynamic metadata in Python is vast, but unless you are writing a resolver or packaging tool, you're not going to experience the pain as much. You might in fact quite enjoy the power of dynamic metadata. Unsurprisingly bringing up the idea to remove it is very badly received. There are so many workflows seemingly relying on it.

At this point, fixing this problem might be really hard because it's a social problem more than a technical one. If the constraints had been in place from the start, these weird use cases would never have emerged. But because the constraints were not there, people were free to go to town with leveraging it, with all the consequences that causes.

I think at this point it's worth moving the cheese, but it's unclear if this can be done through a standard. Maybe the solution will be for tools like uv or poetry to warn if dynamic metadata is used and strongly discourage it. Then over time the users of packages that use dynamic metadata will start to urge the package authors to stop using it.

The cost of dynamic metadata is real, but it's felt only in small ways all the time. You notice it a bit when your resolver is slower than it has to be, you notice it if your packaging tool installs the wrong dependency, you notice it if you need to read the manual for the first time when you have to reconfigure your cache key or force a package to constantly reinstall, you notice it if you need to re-install your local dependencies over and over for them not to break. There are many ways you notice it. You don't notice it as a roadblock, just as a tiny, tiny tax. Except that is a tax we all pay, and it makes the user experience significantly worse compared to what it could be.

The deeper lesson here is that if you give developers too much flexibility, they will inevitably push the boundaries and that can have significant downsides as we can see. Because Python's packaging ecosystem lacked constraints from the start, imposing them now has become a daunting challenge. Meanwhile, other ecosystems, like JavaScript's, took a more structured approach early on, avoiding many of these pitfalls entirely.

[1] You can see how this works in action for sentry-cli for instance. The @sentry/cli package declares all its platform-specific dependencies as optionalDependencies (relevant package.json). Each platform build has a filter in its package.json for os and cpu. For instance this is what the arm64 linux binary dependency looks like: package.json. npm will attempt to install all optional dependencies, but it will skip over the ones that are not compatible with the current platform.

[2] For @sentry/cli at version 2.39.0 for instance this means that this singular URL will return all the information that a resolver needs: registry.npmjs.org/@sentry/cli/2.39.0

[3] A common error in the past was to receive a pkg_resources.DistributionNotFound exception when trying to run a script in local development.

[4] I got some flak on Bluesky for throwing readme generators under the bus. While they do not present the same problem when it comes to metadata like dependencies and versions, they do still increase the complexity. In an ideal world what you find in site-packages represents what you have in your version control and there is a README.md file right there. That's what you have in JavaScript, Rust and plenty of other ecosystems. What we have however is a build step (either dynamic or copying) taking that readme file and placing it in an RFC 5322 header encoded file in a dist-info. So instead of "command clicking" on a dependency and finding the readme, we need special tools or arcane knowledge if we want to read the readme files locally.
Categories: FLOSS Project Planets

Sandro Knauß: Akademy 2024 in Würzburg

Planet Debian - Mon, 2024-11-25 19:00

In order to prepare for Akademy I started some days before to give my Librem 5 (an Open Hardware phone) another try, and ended up with a non-starting Plasma 6. Actually this issue was already known, but hadn't been addressed. In the end I reached Akademy with phosh (which is Gnome based) installed on my Librem 5, in order to have something working.

I met Bhushan and Bart, who took care of it, and after the issue was fixed two days later I could finally install Plasma 6 on it. The last time I tested my Librem 5 with Plasma 5 it felt sluggish and didn't work well. But this time I was impressed how well the system reacts. Sure, there are some rough edges here and there, but in the bigger picture it is quite usable. One annoying issue is that the camera only works with one app, and the other issue is the battery capacity: you have to charge it once a day. Because a QR reader that can use the camera is missing, getting data onto the phone was quite challenging. Unfortunately the conference Wifi separated the devices, so I couldn't use KDE Connect to transfer data. In the end the only way to import data was to take five photos of the QR code to import my D-Ticket into Itinerary.

Having a device with Plasma Mobile, it was directly used for an experiment: how well does Dolphin work on a Plasma Mobile device? Together with Felix Ernst we tried it out and were quite impressed that Dolphin works very well on Plasma Mobile after some simple modifications to the UI. That resulted in a patch to add a mobile UI for Dolphin: !826.

With more time to play with my Librem 5, I also found a bug in KWeather: it is missing a Refresh option when used in a Plasma Mobile environment (#493656).

Akademy is a good place to identify and solve issues. It is always like that: you chat with someone, they can tell you who to ask to answer a concrete question, and in the end you can solve things that seemed unsolvable in the beginning.

There was also time to look into the travelling app Itinerary. A lot of people face plenty of real-world issues when they are not in their home town. Itinerary is the best travel app I know about. It can import nearly every ticket you have, can get location information from restaurant websites, and allows routing to that place. It adds much useful information while travelling, like current delays, platform changes, live updates for elevators, weather information at the destination, and a station map, and all those features come with a strong focus on privacy.

In detail I found some small things to improve:

  • If you search for a bus ride and enter the correct name for the bus stop, it will still add a walk from and to the station. The issue here is that we use different backends, and not all backends share the same geo coordinates. That's why Itinerary needs some heuristics to delete those paths.

  • Instead of displaying just a small station map of one bus stop in the inner city, it showed the complete Würzburg inner city, as there is one big park around the inner city (named "Ringpark").

  • Würzburg has a quite big bus station, but the platform information was missing from the map, so we tweaked the CSS to display the platforms. To be sure that we don't fix only Würzburg, we also checked whether Greifswald and Aix-en-Provence follow the same naming scheme.
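The mismatched-coordinates problem from the first bullet comes down to deciding when two nearby points are really the same stop. A minimal sketch of such a heuristic, assuming a simple distance threshold (illustrative only, not Itinerary's actual code):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(h))

def is_spurious_walk(stop_a, stop_b, threshold_m=50):
    """Treat a walking leg as a backend artifact when both 'stops' are
    within a few dozen meters of each other, i.e. they are most likely
    the same bus stop reported with slightly different coordinates."""
    return haversine_m(*stop_a, *stop_b) < threshold_m

# Two coordinates for the same bus stop from different backends
# (hypothetical values near Würzburg): the leg should be dropped.
print(is_spurious_walk((49.7945, 9.9290), (49.7946, 9.9291)))
```

The 50 m threshold is a guess; a real implementation would also have to consider stop names and backend identifiers before deleting a path.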

I additionally learned that it has a lot of details that help people who have special needs. That is the reason why Daniel Kraut wants to get Itinerary available for iOS. Once Daniel spoke out that he wants to reach this goal, others already started to implement the first steps to build the app for iOS.

This year I volunteered to help out at Akademy. For me it was a lot of fun to meet everyone at the infodesk or to help the speakers set up the beamer and microphone. It is also a good opportunity to meet many new faces and get in contact with them. I also see room for improvement. As we were quite busy at the Welcome Event handing out badges to everyone, I couldn't answer the questions from newcomers, as the queue was too long. I propose that some people volunteer to be available for questions from newcomers. It is often hard for newcomers to make their first contact(s) in a new community, and there is a lot of space for improvement in making it easier for them to join. Some ideas in my head are: make an event for the newcomers to give them some links into the community and show that everyone is friendly. The tables at the BoFs should form a circle, so everyone can see each other; it was also hard for me to understand everyone, as they mostly spoke towards the front. BoFs are also sometimes full of very specific words, and if you are not already deep in the topic you are lost. I can see the problem: on the one side, BoFs are the place where the people who already know the topic want to get things done; on the other side, newcomers join BoFs, are overwhelmed by too many new words, get frustrated and think that they are not welcome. Maybe at least everyone should introduce themselves by name and ask new faces why they joined the BoF, to help them join in.

I'm happy that the food provided for the attendees was very delicious, that I'm not the only one who is mostly vegetarian, and that a big portion of it was vegan.

At the conference, the KDE Eco initiative really caught my attention, as I see a lot of new possibilities in it for giving people more reasons to switch to an Open Source system. Natalie's talk was great for seeing how pupils get excited about Open Source and even help their grandparents move to a Linux system. As I will also start to work as a teacher, I got real ideas for what I can do at school. Together with Joseph and Nicole, we finally started to think about how to run an exploration of what kind of old hardware KDE software still runs on. The ones with the oldest hardware will get an old KDE shirt. For more information see #40.

The conference was very motivating for me; I still had energy in the evening to do some Debian packaging, and I finally pushed kweathercore to Debian and started to work on KWeather. Now I'm even more interested in the KDE apps focused on the mobile world, as I now have hardware that can actually use those apps.

I really enjoyed the workshop on how to contribute to Qt by Volker Hilsheimer, especially the very friendly way Volker explained things, answered every question, and sometimes postponed questions but came back to them later. All in all, I now have a good overview of how Qt development works and how I can fix bugs.

The day trip to Rothenburg ob der Tauber was very interesting for me. It was the first time I visited the town, but in my memory it feels like I already knew it. I grew up reading a lot of comic albums, including the good sci-fi series "Yoku Tsuno" created by the Belgian writer Roger Leloup. Yoku Tsuno is an electronics engineer, raised in Japan but now living in Belgium. In "On the edge of life" she helps her friend Ingard, who actually lives in Rothenburg. Leloup invested a lot of time in travelling to make his drawings as accurate as possible.

To avoid a hard cut from Akademy back to normal life, I had lunch with Carlos to discuss KDE Neon and how we can improve the interaction with Debian. In the future this should have less friction and make both communities work together more smoothly. Additionally, as I used to develop on KDEPIM with the help of Docker images based on Neon, I asked for a kf6 dev meta package. That should help get rid of most hand-written lists of dev packages in the Dockerfile, in order to make it simpler for new contributors to start hacking on KDEPIM.

The rest of the day I finally found time to do the normal tourist stuff: going to the Wine bridge and taking a walk to the castle of Würzburg. Unfortunately you hear a lot of car noise up there, but I could finally relax in a Japanese-designed garden.

Finally, on Saturday I started my trip back. The trains towards Eberswalde were broken and I needed to find an alternative route. I got a little nervous, as it was the first time I travelled with only my Librem 5 and Itinerary, and I needed to reach the next train in less than two minutes. With the indoor maps provided, I could prepare my run through the train station, so I successfully reached my next train.

By the way, even if you only use KDE software, I would recommend that everyone join Akademy ;)

Categories: FLOSS Project Planets


KDE Plasma 6.2.4, Bugfix Release for November

Planet KDE - Mon, 2024-11-25 19:00

Tuesday, 26 November 2024. Today KDE releases a bugfix update to KDE Plasma 6, versioned 6.2.4.

Plasma 6.2 was released in October 2024 with many feature refinements and new modules to complete the desktop experience.

This release adds three weeks' worth of new translations and fixes from KDE's contributors. The bugfixes are typically small but important and include:

  • libkscreen Doctor: clarify the meaning of max. brightness zero. Commit. Fixes bug #495557
  • Plasma Workspace: Battery Icon, Add headphone icon. Commit.
  • Plasma Audio Volume Control: Fix speaker test layout for Pro-Audio profile. Commit. Fixes bug #495752
View full changelog
Categories: FLOSS Project Planets

Ruqola 2.3.2

Planet KDE - Mon, 2024-11-25 19:00

Ruqola 2.3.2 is a feature and bugfix release of the Rocket.chat app.

It includes many fixes for RocketChat 7.0.

Fixes:

  • Fix administrator refresh user list
  • Fix menu when we select video conference message
  • Fix RocketChat 7.0 server support
  • Fix create video message
  • Fix update cache when we change video/attachment description
  • Fix export message job
  • Fix show userOffline when we have a group
  • Fix enable/disable ok button when search room in team dialog
  • Fix crash when we remove room in team dialog
  • Fix update channel selection when we reconnect server

URL: https://download.kde.org/stable/ruqola/
Source: ruqola-2.3.2.tar.xz
SHA256: 57c8ff6fdeb4aba286425a1bc915db98ff786544a3ada9dec39056ca4b587837
Signed by: E0A3EB202F8E57528E13E72FD7574483BB57B18D Jonathan Riddell jr@jriddell.org
https://jriddell.org/jriddell.pgp
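To check a downloaded tarball against the published SHA256 above, something like this works (a minimal sketch; streaming the file in chunks keeps memory use flat for large archives):

```python
import hashlib
import io

PUBLISHED_SHA256 = "57c8ff6fdeb4aba286425a1bc915db98ff786544a3ada9dec39056ca4b587837"

def sha256_of(stream, chunk_size=1 << 16):
    """Hash a binary stream in chunks so large tarballs don't need
    to fit in memory."""
    digest = hashlib.sha256()
    for chunk in iter(lambda: stream.read(chunk_size), b""):
        digest.update(chunk)
    return digest.hexdigest()

def verify(stream, expected):
    """Compare the computed digest against the published checksum."""
    return sha256_of(stream) == expected.lower()

# With the real download:
# with open("ruqola-2.3.2.tar.xz", "rb") as fh:
#     assert verify(fh, PUBLISHED_SHA256)
print(sha256_of(io.BytesIO(b"")))  # digest of empty input, as a smoke test
```

The checksum only guards against corrupted downloads; verifying the PGP signature against the key listed above is what ties the release to its signer.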

Categories: FLOSS Project Planets

Talking Drupal: Talking Drupal #477 - Drupal Association CTO Then & Now

Planet Drupal - Mon, 2024-11-25 15:00

Today we are talking about being the CTO of the Drupal Association, how the job has changed, and how it's impacted Drupal, with guests Josh Mitchell & Tim Lehnen. We'll also cover Automatic Anchors as our module of the week.

For show notes visit: https://www.talkingDrupal.com/477

Topics
  • How long ago were you CTO Josh
  • Tim when did you take over
  • DA infrastructure
  • Drupal Credit System
  • Josh's proudest moment
  • Tim's proudest moment
  • Growth
  • Josh if you could do one thing differently
  • Tim if you could make one change
  • Future of the CTO job
Resources Guests

Tim Lehnen - aspenthornpress.com hestenet

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Joshua "Josh" Mitchell - joshuami.com joshuami

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted headings on your Drupal site to have unique id values, so links can be created to take users to specific parts of any page? There’s a module for that.
  • Module name/project name: Automatic Anchors
  • Brief history
    • How old: created in Jun 2020 by Chris Komlenic (komlenic) of Penn State
    • Versions available: 2.1.1-beta1, which supports Drupal 8.8, 9, and 10
  • Maintainership
    • Test coverage
    • Number of open issues: x open issues, y of which are bugs against the current branch
  • Usage stats:
    • 137 sites
  • Module features and usage
    • By default, the module automatically generates ids on heading elements within the page content
    • Even if two headings have the same content, the module will make sure their ids are unique, as well as making sure they are i18n-friendly, use hyphens instead of spaces, and are short enough to be useful
    • The module won’t interfere with or change manually-added or already-existing HTML ids
    • There’s a permission to view helpful links on each heading that make the ids obvious and easy to copy
    • Configuration options include the root element it should look within (defaults to the body tag), which elements should get ids, what content to use for the displayed links, and whether or not to generate ids on admin pages
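The uniqueness and slug behavior described above can be sketched roughly as follows. This is an illustrative Python sketch of the slug-and-uniquify idea, not the module's actual implementation, and the -1/-2 suffix scheme is an assumption:

```python
import re
import unicodedata

def slugify(text, max_len=60):
    """Build a short, hyphenated, i18n-friendly id fragment
    from heading text."""
    text = unicodedata.normalize("NFKC", text).lower().strip()
    text = re.sub(r"\s+", "-", text)      # spaces -> hyphens
    text = re.sub(r"[^\w\-]", "", text)   # drop punctuation, keep unicode word chars
    return text[:max_len].rstrip("-") or "section"

def assign_ids(headings, existing=()):
    """Give every heading a unique id, never clobbering ids that already
    exist on the page, by suffixing duplicates with -1, -2, ..."""
    seen = set(existing)
    out = []
    for heading in headings:
        base = slugify(heading)
        candidate, n = base, 0
        while candidate in seen:
            n += 1
            candidate = f"{base}-{n}"
        seen.add(candidate)
        out.append(candidate)
    return out

print(assign_ids(["Setup", "Setup", "Über uns"]))
```

Passing the page's pre-existing ids via `existing` mirrors the rule that manually-added ids are left untouched while new ones are generated around them.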
Categories: FLOSS Project Planets

Mike Driscoll: Black Friday Python Deals 2024

Planet Python - Mon, 2024-11-25 13:57

Black Friday and Cyber Monday are nearly here, so it’s time for a Python sale! All my books and courses are 35% off until December 4th if you use the code BF24 at checkout.

You can learn about any of the following topics in my books and courses:

  • Basic Python (Python 101)
  • PDF Processing (ReportLab book)
  • Excel and Python
  • Image processing (Pillow book)
  • Python Logging (NEW this year!)
  • Jupyter Notebook
  • JupyterLab 101 (NEW this month!)
  • and more

Check out my books and courses today!

Other Python Sales

The post Black Friday Python Deals 2024 appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

Pages