Feeds

Celebrating 5 years at the Open Source Initiative: a journey of growth, challenges, and community engagement

Open Source Initiative - Wed, 2024-11-27 06:20

Reaching the five-year mark at the Open Source Initiative (OSI) has been a huge privilege. It’s been a whirlwind of progress, personal growth, and community engagement—filled with highs, great challenges, and plenty of Open Source celebrations. As I reflect on this milestone, it’s impossible not to feel both gratitude and excitement for what lies ahead. This isn’t just a story about my career—it’s about the evolution of Open Source, the incredible people I’ve worked with, and the values we’ve championed.

Joining OSI: first steps into the journey

Back in 2017, under the leadership of OSI’s General Manager Patrick Masson, I stepped into the role of Director of Community and Development at the OSI. That year, we began planning a celebration of the 20th anniversary of Open Source for 2018, a massive undertaking. This wasn’t just about throwing a party. It was a celebration of the immense impact Open Source has had on the global tech community.

And yet, the timing was crucial. Open Source, which had steadily grown over the past two decades, faced challenges on multiple fronts: from sustainability and diversity to startups attempting to redefine the term “Open Source” and obscure its principles. The emergence of faux Open Source licenses, like the Commons Clause and the Server Side Public License, only added to the urgency of defending the very core of our mission.

The 20th Anniversary world tour

We embarked on the OSI's 20th Anniversary World Tour in 2018, organizing over 100 activities across 40 events globally, rallying the support of affiliates, sponsors, and the Open Source community at large. The goal wasn't just to celebrate but to affirm the values we held dear: transparency, collaboration, and freedom in software development. This culminated in the signing of the Affirmation of the Open Source Definition by over 40 foundations and corporations, a powerful statement that we would not back down in the face of challenges.

For the 20th Anniversary, the OSI contributed to 40 events worldwide

Stepping back amidst the pandemic

When the pandemic hit in 2020, everything came to a sudden halt. With schools closing and my young children—a one-year-old and a four-year-old—at home, the demands of balancing work and family life became overwhelming. I made the difficult decision to step down from my role at the OSI. It wasn’t easy to step away from something I loved, but at the time, it felt necessary.

A year later, I joined a security startup as Head of Community, where I led a cutting-edge Open Source project and held a leadership role at the Confidential Computing Consortium at the Linux Foundation. This new role allowed me to expand my expertise in community management and Open Source security—critical topics that would soon come to the forefront.

Return to the OSI: a new chapter

In early 2023, the startup I was working for unfortunately had to close its doors. But just as one chapter ended, another began. Stefano Maffulli, OSI's new Executive Director, reached out to me with an opportunity to return to the OSI. This time, the focus was on addressing a growing concern: the clarity around licenses and security vulnerabilities in the Open Source supply chain.

I jumped at the chance to come back. In the beginning of this new chapter, I had the opportunity to work on two challenges: managing the ClearlyDefined community, a project aimed at improving transparency in Open Source licensing and security, and organizing activities around the 25th Anniversary of Open Source.

The 25th Anniversary world tour

The 25th anniversary of Open Source marked an incredible milestone, not just for the OSI, but for the whole tech community. We organized activities across 36 conferences around the world with a combined attendance of over 125,000 people. We contributed 12 keynotes, 24 talks, 6 workshops, and 18 webinars.

For the 25th Anniversary, the OSI contributed to 36 events worldwide

It was a wonderful experience connecting with organizers of these conferences and engaging with speakers, volunteers and attendees. I personally contributed to 4 keynotes, 6 talks, 15 events, and co-organized the OSI track at All Things Open and the Deep Dive: AI webinars.

Throughout the year, our focus shifted from reviewing the past of Free and Open Source software to exploring the future of Open Source in this new era of AI.

Open Source AI Definition: a new challenge

With the popularization of Generative AI, many players in industry started using “Open Source AI” to describe their projects. Legislators around the world also started drafting laws that would have a huge impact on Open Source and AI developments. Stefano was already exploring this new challenge with the Deep Dive: AI series, but we realized that establishing an Open Source AI Definition was going to be critical. Along with the Board of Directors, we started planning a new direction for the OSI.

We organized several activities around Open Source and AI in partnership with major Open Source conferences, from talks and panels to workshops. We also launched forums for online discussions, and hosted webinars and town halls to make our activities more inclusive. The work felt more important than ever.

One of the biggest undertakings was organizing a multistakeholder co-design process, led by Mer Joyce, where we brought together global experts to establish a shared set of principles that can recreate permissionless, pragmatic and simplified collaboration for AI builders.

One of the projects I’m particularly proud of is the “Voices of the Open Source AI Definition” series. Through this initiative, we’ve been able to share the stories of the volunteers from the co-design process involved in shaping the Open Source AI Definition (OSAID). These stories highlight the diversity and passion of the community, bringing a human element to the often technical discussions around Open Source and AI.

Voices of the Open Source AI Definition

The work around the Open Source AI Definition developed by the OSI was cited over 100 times in the press worldwide, educating readers and countering misinformation. Our work was featured in The New York Times, The Verge, TechCrunch, ZDNET, InfoWorld, Ars Technica, IEEE Spectrum, and MIT Technology Review, among other top media outlets.

ClearlyDefined: bringing clarity to licensing

As part of my community management role on the ClearlyDefined project, I had the opportunity to contribute under the exceptional leadership of E. Lynette Rayle (GitHub) and Qing Tomlinson (SAP), whose guidance was instrumental in driving the project forward. One of my major accomplishments was the creation of a brand-new website and comprehensive documentation.

We engaged with the ORT (OSS Review Toolkit) and ScanCode communities, building stronger connections. We made important technical updates, including upgrading to the latest version of ScanCode and expanding our license support beyond just SPDX, making it easier for developers and organizations to navigate the increasingly complex landscape of licensing. This important work led to the release of version 2.0 of ClearlyDefined.

Another highlight was a new harvester implementation for conda, a popular package manager with a large collection of pre-built packages for various domains, including data science, machine learning, scientific computing and more.

Additionally, the adoption of ClearlyDefined by the GUAC community from OpenSSF was a testament to the growing importance of our work in bringing clarity to the Open Source supply chain not just in terms of licensing but also security.

ClearlyDefined’s new features: LicenseRef, conda support, and GUAC integration

We presented our work across three continents: in Europe at ORT Community Days, in North America at SOSS Fusion, and in Asia at the Open Compliance Summit.

Finally, we made progress toward a more open governance model by electing leaders for the ClearlyDefined Steering and Outreach Committees.

Open Policy: from cybersecurity to AI

On the policy side, I had the privilege of working alongside Deb Bryant and Simon Phipps, who kept track of policies affecting Open Source software, in particular the Securing Open Source Software Act in the US and the Cyber Resilience Act in Europe. As for policies involving Open Source and AI, I followed the European AI Act and the US AI Bill of Rights, and contributed a compilation of compelling responses from nonprofit organizations and companies to NTIA's AI Open Model Weights RFC.

OpenSource.net: fostering knowledge sharing

I also helped to launch OpenSource.net, a platform designed to foster knowledge sharing. Led by Editor-in-Chief Nicole Martinelli, this platform has become a space for diverse perspectives and contributions, furthering the reach and impact of our work.

As part of the Practical Open Source (POSI) program, which facilitates discussions on doing business with and for Open Source, I was able to bring contributions from outstanding entrepreneurs and intrapreneurs.

Looking ahead: excitement for the future

In the last 2 years at the OSI, I’ve had the privilege of publishing over 50 blog posts, as well as organizing, speaking at, and attending multiple events worldwide. The work we’ve done has been both challenging and rewarding, but it’s the community—the people who believe in the power of Open Source—that makes it all worthwhile.

As I celebrate five years at the OSI, I’m more energized than ever to continue this journey. The world of Open Source is evolving, and I’m excited to be part of shaping its future. There’s so much more to come, and I can’t wait to see where the next five years will take us.

This is more than just a personal milestone—it’s a celebration of the impact Open Source has had on the world and the endless possibilities it holds for the future.

You too can join the OSI: support our work

The work we do is only possible because of the passionate, engaged community that supports us. From advocating for Open Source principles to driving initiatives like the Open Source AI Definition and ClearlyDefined, every step forward has been powered by collaboration and shared commitment.

But there’s so much more to be done. The challenges facing Open Source today—from licensing to policy—are growing in scale and complexity. To meet them, we need your help. By joining or sponsoring the Open Source Initiative, you enable us to continue this vital work: educating, advocating, and building a stronger, more inclusive Open Source ecosystem.

Your support isn’t just a contribution; it’s an investment in the future of Open Source. Together, we can ensure that Open Source remains open for all. Join us in shaping the next chapter of this incredible journey.

Categories: FLOSS Research

ComputerMinds.co.uk: Putting 1000 sites behind Cloudflare

Planet Drupal - Wed, 2024-11-27 04:29
That’s a lot of sites!

Yes, it is! 10+ years ago we migrated 200 sites to a new server - and in 2024 we set up Cloudflare protection for well over 1000 sites.

Aegir is a hosting system built in Drupal, for Drupal. It lets you create and manage Drupal sites and all their databases, filesystems and virtual hosts. With Aegir, it’s easy to manage hundreds or thousands of sites via a simple UI. Each site has a node to represent it, and this project stored a whole bunch of additional Cloudflare metadata against the Site Nodes.
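
As a rough illustration of the kind of metadata involved (the real implementation stores this in Drupal fields on the Aegir Site Node; the names below are purely hypothetical), the per-site record boils down to something like this Python sketch:

# Hypothetical shape of the Cloudflare metadata kept against an Aegir Site Node.
site_cloudflare_metadata = {
    "domain": "example.com",
    "zone_id": None,             # filled in once the zone is created via the Cloudflare API
    "assigned_nameservers": [],  # the Cloudflare nameservers to hand to whoever controls the domain
    "dns_records": [],           # records exported from the old provider, minus SOA and NS
    "setup_status": "pending",   # e.g. pending -> zone created -> records uploaded -> active
}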

Keeping a PaaS product online at all times comes with a high level of responsibility. After code quality assurance and testing, DDoS attacks of all sizes and types are a high-risk threat. The cost of protecting our availability, unsurprisingly, was non-trivial and became a point requiring fresh research and investment. Reducing the general load and the potential attack load on our servers would serve to support our quality of service.

In the Spring of 2024 we set up a proof of concept using Cloudflare, which would allow us to make a significant ongoing cost saving whilst also playing with some really cool APIs.

The plan

In order to put all our sites behind Cloudflare, we needed to:

* Get Aegir talking to Cloudflare via their API, and build the automatic processes to support the setup process
* Create a clear interface for starting and tracking setup per site
* Create a clear dashboard for tracking progress overall
* Go! Change the nameserver records for every domain to point to Cloudflare

Here are some of the most interesting parts of our story (which involved negligible downtime, by the way!).

Interacting with Cloudflare’s API

It was greatly pleasing to find that Cloudflare lets you do almost everything over an API (given the right authentication!). Given the scale of their company, it shouldn't be a huge surprise, I suppose!

With approximately 1,000 sites to deal with, there was a strong drive to automate as much of this process as possible. So there was much celebration when it became clear that we would be able to do almost every step automatically (a rough sketch of the central API calls follows the list below), including:

* create everything we needed for the site domain, with an appropriate plan under our chosen account
* upload DNS records, query them and update them
* perform real-time ownership verification and SSL validation
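
For a feel of what those calls look like outside of Aegir, here is a minimal Python sketch of the two central steps: creating a zone and uploading a DNS record. It is illustrative only (the production code lives in Aegir/Drupal), assumes an API token with permission to create zones and edit DNS, and omits error handling, plan selection and pagination.

import requests

API = "https://api.cloudflare.com/client/v4"
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}  # placeholder token

def create_zone(domain, account_id):
    # Register the domain as a new zone under the chosen account; the response
    # includes the zone ID and the nameservers to point the domain at.
    resp = requests.post(f"{API}/zones", headers=HEADERS,
                         json={"name": domain, "account": {"id": account_id}})
    resp.raise_for_status()
    return resp.json()["result"]

def upload_dns_record(zone_id, record):
    # record is a dict such as:
    # {"type": "A", "name": "www.example.com", "content": "203.0.113.10", "ttl": 3600, "proxied": True}
    resp = requests.post(f"{API}/zones/{zone_id}/dns_records", headers=HEADERS, json=record)
    resp.raise_for_status()
    return resp.json()["result"]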

Simple admin setup - Obtain DNS records for hundreds of domains, store them against Site Nodes, press GO

A large portion of the domains (about 500) are under our control, so we were able to bulk export the DNS records, process and save them against their Aegir Site Nodes. Some simple processing removed the SOA and NS records so that we would be able to send the records straight to Cloudflare when setup started.
These ‘easy’ sites, for which we had the DNS records, would be processed in bulk with a lot of Go! button clicks, followed by making the relevant nameserver changes with the domain provider.
(The domain provider did offer to do bulk updates for us, but there seemed to be a 24h delay before action was taken - so it was quicker to do these changes ourselves.)
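
The record processing mentioned above is the simplest part. Assuming the bulk export has already been parsed into one dictionary per record, a hedged sketch of the filtering step might look like:

def prepare_records_for_cloudflare(exported_records):
    # Cloudflare supplies its own SOA and NS records for a zone, so the old
    # provider's copies are dropped before the upload.
    return [r for r in exported_records if r.get("type") not in ("SOA", "NS")]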
 

Cropped screen grab of our progress pie chart

 

 

Creating a self-service setup mechanism via Drupal config + settings.php

Domains that we only had nameserver control for would have to be updated by the customer. But how would they know what nameservers to set up, and how would they trigger the setup process? Stats to the rescue!
As part of the Drupal 10 rebuild of the platform we added a Statistics module, which collects a selection of data points from each site and passes them to the corresponding node on Aegir for storage. They’re then aggregated and sent back to the sites so that customers can compare their performance to the cohort averages.
We created a form interface for the user to trigger setup when ready, and then smuggled the outputs over to Aegir in amongst the performance stats 🤭 

This self-service route did still require a lot of chasing, but generally performed well, as it allowed users to perform the nameserver change at a time of their choosing, rather than requiring scheduled calls and appointments on top of an already high administrative load.
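
The platform itself is Drupal, so the real code is PHP, but the idea is easy to sketch in Python: when the customer presses the setup form's Go button, the site simply adds an extra, hypothetical key to the statistics payload it already sends to its Aegir Site Node, so no new transport channel is needed.

def build_stats_payload(site_stats, cloudflare_setup_requested=False):
    # Start from the regular performance statistics the site already reports.
    payload = dict(site_stats)
    if cloudflare_setup_requested:
        # "Smuggle" the setup trigger along with the stats; Aegir picks it up
        # when the payload is stored against the Site Node.
        payload["cloudflare_setup_requested"] = True
    return payload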

Surfacing useful data, stats and buttons for the team

When developing a technical tool that ideally needs a single fire-and-forget button to kick things off, you not only need that one button but also a lot of clear visual cues to help others understand what’s going on.

We tied the setup steps to interface outputs, with clear dependency messaging and reporting.
Reporting eventually included a message in our Slack channel ⭐️
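
The Slack step is typically a one-liner against an incoming-webhook URL; a minimal sketch (with a placeholder webhook URL) might be:

import requests

def notify_slack(domain, status, webhook_url="https://hooks.slack.com/services/XXX/YYY/ZZZ"):
    # Post a short progress message to the team channel via an incoming webhook.
    requests.post(webhook_url, json={"text": f"Cloudflare setup for {domain}: {status}"}, timeout=10)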
 

Our custom Aegir Site dashboard, with various setup steps all nice and green

The results

After just a few weeks, we had 990 sites set up on Cloudflare - 90% of the most important setups. It turned out to be very difficult to get hundreds of different people, groups and stakeholders to make DNS nameserver changes quickly (even when you tell them it’s urgent!), so the process would continue a little longer.

Already we can measure the success - thanks to Cloudflare’s caching, we’ve seen decent reductions in bandwidth use and the number of requests hitting the server.
 

Our Cloudflare Dashboard on the Production Aegir server. Inactive sites didn't need our fuller protection setup steps, so the blue and pink bars don't match exactly.
Categories: FLOSS Project Planets

CLI++: Upgrade Your Command Line

Planet KDE - Wed, 2024-11-27 03:00

In a recent email, KDABian Leon Matthes highlighted some of his go-to command line tools for everyday use on Unix. His recommendations sparked a lively exchange among our colleagues, each sharing their own favorite utilities.

Many of these tools offer efficient alternatives to standard Unix programs, speeding up the workflow or otherwise enriching the development experience.

This article aims to serve as a resource for the wider community, encouraging others to explore these tools and upgrade their command line setup for improved productivity.

The following common Unix tools are listed in this document, with their alternatives specified in their respective sections: cd, ls, man, find, grep, cat, and diff / git diff.

Additionally, there are some tools here that do not replace common programs but are extremely useful when working in a shell environment. These are broken into two broad categories, covered in their own sections further down.

Many of these tools can be considered examples of the “rewrite it in Rust” approach (RIIR), which in this document will be denoted with Ferris the crab: 🦀

Finally, there is a bonus Rust crate listed at the end that is helpful for writing CLI programs of your own.

Note: for scripting, stick with standard Unix tools, as they’re more widely distributed and will work in most other Unix environments. The tools in this article are meant to improve quality of life working within your own shell environment on a daily basis.

cd → autojump

autojump is an alternative to cd, invoked with the command j. It jumps to what it determines to be the best fit for whatever directory name you give it. For example, j cxx-qt will change directory to the cxx-qt directory, even if it’s nested several levels deep from the current working directory.

If autojump doesn’t prioritize a directory correctly, its priority weight can be increased or decreased. j -i XXX will increase the priority weight of the current working directory by XXX, while j -d XXX will do the opposite and decrease the weight accordingly.

autojump also supports opening a file browser at a matched location rather than cd-ing into it, by invoking jo rather than j.

Unfortunately, autojump has not been in active development for the past two years, and can break in some environments. The next tool, zoxide, will be more reliable if autojump breaks for you.

zoxide 🦀

zoxide is a similar program to autojump, actually drawing inspiration from it. Invoked with the command z, it remembers paths that have been visited in the past and matches the best fitting directory. The tool also supports interactive selection of the correct match via the zi command, which uses the tool fzf for fuzzy finding. fzf is listed later in this article.

zoxide can also be used the same way as plain cd, and can target absolute and relative paths. It can be configured to be invoked from the j and ji commands, or even cd to fully replace the cd command.

Finally, it is implemented in Rust, resulting in better performance than autojump.

ls → lsd 🦀

lsd is a more feature-rich version of ls, including colors, icons, tree-view, additional formatting, and more. It is also highly configurable through a yaml config file.

eza 🦀

eza is a small, fast ls rewrite that knows about symlinks, extended attributes, and git, and has support for hyperlinks. Similar to lsd, it supports colors and icons, which are configurable with a yaml config file.

man → tldr

tldr is an alternative to using man for most common use cases. It is a collection of pages that provides brief descriptions and usage examples for loads of programs, standard Unix and otherwise. It’s much faster if you don’t need to read about every option in detail in order to find the correct way to invoke a command or select the proper option quickly.

The tldr project has several clients written in different languages; the most mature client is an npm package.

For example, if the man page for tar is too long to read when looking for the right option to do something, simply running tldr tar will show a list of examples that will usually have the options you need.

cheat.sh / cht.sh

cheat.sh and its associated shell script cht.sh provide access to a superpowered aggregation of cheatsheets for programming languages and CLI tools, including tldr pages and other sources.

It has a great system for querying cheatsheets, and you don’t even need to install a tool (though you can), as the query response can be retrieved by curl. It’s also usable inside editors, for easy referencing while programming.

The cheatsheet sources also include StackOverflow answers, which allow for flexible usage like so:

$ cht.sh js parse json

A nice way to use cheat.sh without installing anything is to put the following function in your .bashrc or .zshrc:

function cheat() {
    if ! curl -s "cheat.sh/${*//\ /+}"; then
        echo "Failed to get the cheatsheet" >&2
        return 1
    fi
}

Then you can use it like

$ cheat tar

or

$ cheat cpp number to string

find → fd 🦀

fd is an alternative for many use cases of find.

While it is not as powerful as find, it is designed to be faster and more user-friendly for the majority of find's use cases.

No more find -name '*.rs'; just use fd .rs and you're good to go.

fd is super fast and will search through your entire filesystem in a matter of seconds (Leon recorded 1.4s on his machine with 177GB of files).

By default, fd will not include hidden files or any files ignored by .gitignore, which is very useful as it doesn't give you any random junk from your build directory and speeds up the search even further.

Use --hidden and --no-ignore (or -H and -I, respectively) to search for everything.

fzf

fzf is a fuzzy-finder that can match files, command history, processes, hostnames, bookmarks, git commits, and more, with interactive selection. It’s also usable as a vim or neovim plugin.

grep → ripgrep 🦀

ripgrep (rg as a command) provides an incredibly fast grep alternative with a simpler interface.

Did you ever have to wait for grep to finish searching a directory or had to look up the right options for specific usage? Well, no more! Ripgrep will search the contents of large build directories within milliseconds.

Just running rg Q_PROPERTY will search through the entire current directory if you don’t pipe something into it, which is very convenient and the way it should have been in the first place. The output is also a lot friendlier and easier to read.

Like fd, use --hidden and --no-ignore to search in hidden and ignored directories/files.

cat → bat 🦀

bat, written by the creator of fd, is a superpowered Rust rewrite of cat. It provides syntax highlighting, line numbers, support for displaying non-printable characters, support for showing multiple files at once, and pipes to less by default for large output. It can also be integrated with fzf, find, fd, ripgrep (rg), git show, git diff, man, and more. A tool you’ll read about later in this document, delta, uses bat as a dependency to display diffs.

diff / git diff → diff-so-fancy

diff-so-fancy is an alternative to vanilla git diff that improves readability, with better formatting, colors, and highlighting. Configure git to use diff-so-fancy as its pager, and git diff will provide the more readable diff-so-fancy output.

delta 🦀

delta is an alternative to diff-so-fancy that's written in Rust. It includes additional features, like line numbers, syntax highlighting, a side-by-side view, improved navigation between files in large diffs, hyperlinked commit hashes, improvements to the display of git blame and merge conflicts, and the ability to syntax-highlight grep output. delta uses bat under the hood and, as such, they can share color themes.

prompts & shells: starship 🦀

starship is a prompt written in Rust, which can be used with a number of popular shells including bash, zsh, fish, and even Windows shells like cmd and PowerShell.

The prompt is both stylish and informative, providing information like the git branch, tool versions, and more in the prompt itself. It also shows how long a command took to execute whenever the run lasts more than a few seconds, which is very helpful for gauging the duration of a build. This prompt should be used with a Nerd Font for icon support.

Starship is also extremely configurable, so any details can be omitted or added; the look and feel of the prompt can be completely customized, etc.

zsh prompts & plugins

Some shell users argue that switching from bash is definitely worthwhile, with zsh being the favorite choice for many. It’s highly configurable and has a rich plugin ecosystem.

oh-my-zsh is a zsh framework that makes completions, searching in history, the prompt, etc., much nicer than the defaults. It can, of course, be tweaked much further from there. If you find oh-my-zsh to be a bit bloated, there are more lightweight alternatives such as ZimFW or Zinit.

Some zsh plugins can also serve as alternatives to previously mentioned tools. There are quite a few useful plugins listed here (they should work with any plugin manager): https://github.com/unixorn/awesome-zsh-plugins.

Some that stand out include:

  • zsh-vi-mode
    • a proper Vim mode for the terminal, much better than the default one
  • fast-syntax-highlighting
    • nice while-you-type syntax highlighting in the shell
  • fzf-marks
    • alternative workflow to zoxide and similar cd improvements
    • this can also be used with bash
  • zsh-autoenv
    • customize environment variables by directory, similar to the tool direnv mentioned later

Note: If you are interested in switching away from bash but don’t like zsh, consider trying fish.

Update Systems: topgrade 🦀

topgrade is a tool which conveniently updates several package managers with one command.

Just run topgrade and every package manager under the sun will be updated, one after the other. This even includes flatpaks and updates via your system's package manager, which is very convenient.

Misc.: direnv

direnv is an extension for several common Unix shells, focused on environment management. More specifically, it is intended to unclutter .profile and export different environment variables depending on the current directory.

On each prompt, it checks for the existence of a .envrc file in the current directory, otherwise looking for .env, and loads or unloads environment variables accordingly. It’s a single executable and very fast.

direnv is also extensible with bash files in .config.

blobdrop

blobdrop enables drag-and-drop of files from a terminal window, which can be very convenient when using the command line as a file browser. It was written by fellow KDABian Magnus Gross.

bonus! clap

This bonus is not quite a tool, but a Rust crate frequently used for writing CLI tools: clap.

clap is used in the codebases of a majority of the Rust tools in this document, as it’s incredibly useful. It’s the reason most Rust CLI tools have great UX.

Using the derive feature, you can simply mark up a struct of data you need from your command line arguments and clap will generate a beautiful and convenient CLI interface for you. Just take a look at the derive documentation to get an idea of what this looks like.

Other Tools

We hope these tools can improve your workflow, as they certainly make our lives on the command line far more enjoyable.

If you have any additions to this list, feel free to discuss them in the comments.

About KDAB

If you like this article and want to read similar material, consider subscribing via our RSS feed.

Subscribe to KDAB TV for similar informative short video content.

KDAB provides market leading software consulting and development services and training in Qt, C++ and 3D/OpenGL. Contact us.

The post CLI++: Upgrade Your Command Line appeared first on KDAB.

Categories: FLOSS Project Planets

Django Weblog: Django 6.x Steering Council Candidate Registration

Planet Python - Wed, 2024-11-27 02:00

Following our announcement of the 6.x Steering Council elections, today we open candidate registrations. Registrations will be open until December 4 2024 at 23:59 Anywhere on Earth.

Register as a Steering Council candidate

Eligibility

Candidate eligibility requirements are defined in DEP 12: The Steering Council. To be qualified for elections, we require both of the following:

  • A history of substantive contributions to Django or the Django ecosystem. This history must begin at least 18 months prior to the individual's candidacy for the Steering Council, and include substantive contributions in at least two of these bullet points:
    • Code contributions on Django projects or major third-party packages in the Django ecosystem
    • Reviewing pull requests and/or triaging Django project tickets
    • Documentation, tutorials or blog posts
    • Discussions about Django on the django-developers mailing list or the Django Forum
    • Running Django-related events or user groups
  • A history of engagement with the direction and future of Django. This does not need to be recent, but candidates who have not engaged in the past three years must still demonstrate an understanding of Django's changes and direction within those three years.

If you have questions about the election please contact foundation@djangoproject.com or ask on the Django forum.

Categories: FLOSS Project Planets

Dries Buytaert: Introducing Drupal Starshot's product strategy

Planet Drupal - Tue, 2024-11-26 21:59

I'm excited to share the first version of Drupal Starshot's product strategy, a document that aims to guide the development and marketing of Drupal Starshot. To read it, download the full Drupal Starshot strategy document as a PDF (8 MB).

This strategy document is the result of a collaborative effort among the Drupal Starshot leadership team, the Drupal Starshot Advisory Council, and the Drupal Core Committers. We also tested it with marketers who provided feedback and validation.

Drupal Starshot and Drupal Core

Drupal Starshot is the temporary name for an initiative that extends the capabilities of Drupal Core. Drupal Starshot aims to broaden Drupal's appeal to marketers and a wider range of project budgets. Our ultimate goal is to increase Drupal's adoption, solidify Drupal's position as a leading CMS, and champion an Open Web.

For more context, please watch my DrupalCon Portland keynote.

It's important to note that Drupal Starshot and Drupal Core will have separate, yet complementary, product strategies. Drupal Starshot will focus on empowering marketers and expanding Drupal's presence in the mid-market, while Drupal Core will prioritize the needs of developers and more technical users. I'll write more about the Drupal Core product strategy in a future blog post once we have finalized it. Together, these two strategies will form a comprehensive vision for Drupal as a product.

Why a product strategy?

By defining our goals, target audience and necessary features, we can more effectively guide contributors and ensure that everyone is working towards a common vision. This product strategy will serve as a foundation for our development roadmap, our marketing efforts, enabling Drupal Certified Partners, and more.

Drupal Starshot product strategy TL;DR

For the detailed product strategy, please read the full Drupal Starshot strategy document (8 MB, PDF). Below is a summary.

Drupal Starshot aims to be the gold standard for marketers that want to build great digital experiences.

We'd like to expand Drupal's reach by focusing on two strategic shifts:

  1. Prioritizing Drupal for content creators, marketers, web managers, and web designers so they can independently build websites. A key goal is to empower these marketing professionals to build and manage their websites independently without relying on developers or having to use the command line or an IDE.
  2. Extending Drupal's presence in the mid-market segment, targeting projects with total budgets between $30,000 and $120,000 USD (€25,000 to €100,000).

Drupal Starshot will differentiate itself from competitors by providing:

  1. A thoughtfully designed platform for marketers, balancing ease of use with flexibility. It includes smart defaults, best practices for common marketing tasks, marketing-focused editorial tools, and helpful learning resources.
  2. A growth-oriented approach. Start simple with Drupal Starshot's user-friendly tools, and unlock advanced features as your site grows or you gain expertise. With sophisticated content modeling, efficient content reuse across channels, and robust integrations with other leading marketing technologies, ambitious marketers won't face the limitations of other CMSs and will have the flexibility to scale their site as needed.
  3. AI-assisted site building tools to simplify complex tasks, making Drupal accessible to a wider range of users.
  4. Drupal's existing competitive advantages such as extensibility, scalability, security, accessibility, multilingual support, and more.

What about ambitious site builders?

In the past, we used the term ambitious site builders to describe Drupal's target audience. Although this term doesn't appear in the product strategy document, it remains relevant.

While the strategy document is publicly available, it is primarily an internal guide. It outlines our plans but doesn't dictate our marketing language. Our product strategy's language purposely aligns with terms used by our target users, based on persona research and interviews.

To me, "ambitious site builders" includes all Drupal users, from those working with Drupal Core (more technically skilled) to those working with Drupal Starshot (less technical). Both groups are ambitious, with Drupal Starshot specifically targeting "ambitious marketers" or "ambitious no-code developers".

Give feedback

The product strategy is a living document, and we value input. We invite you to share your thoughts, suggestions, and questions in the product strategy feedback issue within the Drupal Starshot issue queue.

Get involved

There are many opportunities to get involved with Drupal Starshot, whether you're a marketer, developer, designer, writer, project manager, or simply passionate about the future of Drupal. To learn more about how you can contribute to Drupal Starshot, visit https://drupal.org/starshot.

Thank you

I'd like to thank the Drupal Starshot leadership team, the Drupal Starshot Advisory Council, and the Drupal Core Committers for their input on the strategy. I'm also grateful for the marketers who provided feedback on our strategy, helping us refine our approach.

Categories: FLOSS Project Planets

Drupal Association blog: Meet Stella Power: Advocating for Community Growth and Diversity

Planet Drupal - Tue, 2024-11-26 19:49

We’re thrilled to introduce Stella Power, one of the newest members elected to the Drupal Association Board in October. Stella is the Managing Director of Annertech, a digital agency based in Dublin. She holds an MSc in Software Engineering from Dublin City University and a BA in Computational Physics from Trinity College Dublin. Since founding Annertech in 2008, she has grown the company into a team of approximately 40 members spanning the EU and beyond.

A passionate advocate for women in tech, Stella has been featured in numerous publications. In 2021, she was honored with the Women in Digital Award at Ireland’s National Digital Awards and, in 2022, received a Women in Drupal Award in the Scale category.

Stella is also an open-source evangelist and a sought-after speaker at international conferences, where she shares her expertise on development best practices, service delivery, data migrations, and security. She continues to contribute actively to Drupal and serves on the Drupal Security Team, the DrupalCon Europe Advisory Board, and the Starshot Advisory Council, in addition to her role on the Drupal Association Board.

We’re excited to have Stella on the Board, and she recently shared her insights on this new chapter in her journey:

What are you most excited about when it comes to joining the Drupal Association Board?
I’m looking forward to playing a role in shaping the future of Drupal. For me, Drupal is more than just a fantastic CMS—it’s a community. Being on the board gives me an opportunity to give back in a different way, stepping back to focus on the big picture. It’s not about concentrating on one part of the community or one piece of the codebase. I’m truly excited about this new role.

What do you hope to accomplish during your time on the board?
A big part of the Drupal Association's focus is fostering the growth of the Drupal community and supporting the project’s vision to create a safe, secure, and open web for everyone. As my role at Annertech changed from development to management, so did my contributions to Drupal. Instead of just contributing code I started contributing more to the community, most often in event organizing or on various committees. I contribute because I want to give back. The Drupal community has given me so much, even a career. I want to ensure others can benefit from it like I have.

I am a big advocate of getting new people involved. I love listening to new voices, and getting fresh perspectives on challenges or issues facing the Drupal community, and I look forward to helping grow the diversity of our community and amplifying voices that haven’t been heard before. The newer members of our community have a completely different take on what they want from Drupal. I look forward to listening to what they want, and to their ideas of Drupal and for Drupal.

What specific skill or perspective do you contribute to the board?
I have a coding background, and with that comes a built-in problem-solving radar. Developers tend to approach challenges like puzzles, and I’ve developed strong skills in context-switching. If something isn’t working I can change tack and come at it from a different angle. It was the logical, problem-solving aspect that first drew me to tech, and I’d like to think that that's a valuable skill.

How has Drupal impacted your life or career?
"If I hadn’t found Drupal I probably would have ended up doing something completely different with my life. I used to be an avid ceramic artist and amateur photographer. It was mainly just for fun, as a hobby, but I wanted to grow it into something more. However, it was when I went to build a website to showcase my portfolio that I found Drupal, and then it was contributing to Drupal that became my hobby!

I loved being involved in the community and contributing, but it wasn’t until I attended my first DrupalCon—in Szeged—that I realized it could be more than just a hobby. I was introduced to people who were doing such incredible things with Drupal, who were helping clients achieve their goals by creating incredible websites, and that’s when I decided to found my own digital agency and Annertech was born.

Tell us something that the Drupal community might not know about you.
I studied physics at university and initially aspired to be an astrophysicist. I even worked as an intern at the Space Telescope Science Institute with the Hubble Telescope! I loved the problem-solving aspects of physics, but Drupal allowed me to combine my love for problem-solving with my creative side.

Share a favorite quote or piece of advice that has inspired you.
I love the principle of aggregate marginal gains. It’s the idea that if you improve by just 1% consistently, those small gains will add up to remarkable improvement. We live in such a hectic world, where it's easy to feel overwhelmed or that challenges are insurmountable. Small improvements are far less daunting, and when you make small improvements on a daily basis, the overall results can be incredible.

We’re excited to see the amazing contributions Stella will make during her time on the Drupal Association Board. Thank you, Stella, for dedicating your time and expertise to serving the Drupal community! Connect with Stella on LinkedIn

The Drupal Association Board of Directors comprises 13 members: nine are nominated for staggered three-year terms, two are elected by Drupal Association members, one seat is reserved for the Drupal Project Founder, Dries Buytaert, and another is reserved for the immediate past chair. The Board meets twice in person and four times virtually each year. It oversees policy development, executive director management, budget approvals, financial reporting, and fundraising efforts.

Categories: FLOSS Project Planets

Brett Cannon: What the PSF Conduct WG does

Planet Python - Tue, 2024-11-26 18:28

In the past week I had two people separately tell me what they thought the PSF Conduct WG did and both were wrong (and incidentally in the same way). As such, I wanted to clarify what exactly the WG does for people in case others also misunderstand what the group does.

⚠️ I am a member of the PSF Conduct WG (whose membership you can see by checking the charter), and have been for a few years now. That means I speak from experience, but I also may be biased in some way that I'm not aware of. But since this post is meant to be objective, I'm hoping there aren't any concerns about bias.

🔔 There are a myriad of conduct groups in the Python community beyond the PSF Conduct WG, and they all work differently. For example, conferences like PyCon US have their own, the Django and NumFOCUS communities have their own, etc. This post is about a specific group and does not represent other ones.

I would say there are 4 things the Conduct WG actually does (in order from least to most frequent):

  1. Maintain the PSF Code of Conduct (CoC)
  2. Let PSF members know when they have gone against the CoC in a public space
  3. Record disciplinary actions taken by groups associated with the PSF
  4. Provide conduct advice to Python groups

Let's talk about what each of these means.

Maintain the CoC

In September 2019 the CoC was rewritten from a two paragraph "don't be mean" CoC to a more professional one. That rewrite actually is what led to the establishment of the Conduct WG in the first place. Since then, the Conduct WG is in charge of making any changes as necessary to the document. But ever since the rewrite was completed, it is rarely touched.

Let PSF members know when they have gone against the CoC publicly

Becoming a member of the PSF requires that you "agree to the community Code of Conduct". As such, if you are found to be running afoul of the CoC publicly where you also declare your PSF membership, then the Conduct WG will reach out to you and kindly let you know what you did wrong and to please not do that (technically you could get referred to the PSF board to have your membership revoked if you did something really bad, but I'm not aware of that ever happening).

But there are two key details about this work of the WG that I think people don't realize that are important. One is the Conduct WG does not go out on the internet looking for members who have done something that's in violation of the CoC. What happens instead is people report to the WG when they have seen a PSF member behave poorly in public while promoting their PSF membership (and this tends to be Fellows more than the general members).

Two, this is (so far) only an issue if you promote the fact that you're a PSF member. What you do in your life outside of Python is none of the WG's concern, but if you, e.g., call out your PSF affiliation on your profile on X and then post something that goes against the CoC, then that's a problem, as that then reflects poorly on the PSF and the rest of the membership. Now, if someone were to very publicly come out as a member of some heinous organization, even without talking about Python, then that might be enough to warrant the Conduct WG saying something to the PSF board (and this probably applies more to Fellows than general members), but I haven't seen that happen.

Record CoC violations

If someone violates the CoC, some groups report them to the Conduct WG and we record who violated the CoC, how they violated it, and what action was taken. The reason for this is to see if someone is jumping from group to group, causing conduct issues, but in a way that the larger pattern isn't being noticed by individual groups. But to be honest, not many groups report things (it is one more thing to do after dealing with a conduct issue, which is exhausting on its own), and typically people who run afoul of the CoC where a pattern would be big enough to cause concern usually do it enough in one place as well, so the misconduct is noticed regardless.

Provide advice

The most common thing the Conduct WG does, by far, is provide advice to other groups who ask us for said advice, based on the WG's training and expertise. This can range from "can you double-check our logic and reaction as a neutral 3rd party?" to "can you provide a recommendation on how to handle this situation?"

While this might be the thing the Conduct WG does the most, it also seems to be the most misunderstood. For instance, much like with emailing PSF members when they have violated the CoC publicly while promoting their PSF membership, the Conduct WG does not go out looking for people causing trouble. This is entirely driven by people coming to the WG with a problem. The closest thing I can think of to the Conduct WG proactively reaching out is when some group that got a grant from the PSF Grants WG did something wrong around the CoC, it was reported to the Conduct WG, and that warranted us notifying the Grants WG of the problem. But the Conduct WG isn't snooping around the internet looking for places to give advice.

I have also heard folks say the Conduct WG "demanded" something, or "made" something happen. That is simply not true. The Conduct WG has no power to compel some group to do something (i.e. things like moderation and enforcement are handled by the folks who come to the Conduct WG asking for advice). As an example, let's say the Python steering council came to the Conduct WG asking for advice (and that could be as open-ended as "what do you recommend?" or as specific as "we are thinking of doing this; does that seem reasonable to you?"). The Conduct WG would provide the advice requested, and that's the end of it. The Conduct WG advised in this hypothetical; it didn't require anything. The SC can choose to enact the advice, modify it in some way, or flat-out ignore it; the Conduct WG cannot make the SC do anything (heck, the SC isn't even under the PSF's jurisdiction, but that's not an important detail here, just something else I have heard people get wrong). And this inability to compel a group to do something even extends to groups that come to the Conduct WG for advice, even if they are affiliated with the PSF. Going back to the Grants WG example, we can't make the Grants WG pull someone's grant or deny future grants, we can just let them know what we think. We can refer an issue to the PSF board, but we can't compel the board to do anything (e.g., if we warn a PSF member about their public conduct, we can't make them stop being a PSF member for it; the most we can do is inform the PSF board about what someone has done and potentially offer advice).

Having said all of that, anecdotally it seems that most groups that request a recommendation from the Conduct WG enact those recommendations. So you could say the Conduct WG was involved in some action that was taken based on the WG's recommendation, but you certainly cannot assign full blame to the WG for the actions taken by other groups either.

Categories: FLOSS Project Planets

Morpht: Custom Meta Module joins Metatag: Unlocking advanced custom tag management

Planet Drupal - Tue, 2024-11-26 18:04
The Drupal Metatag module has been extended with a new submodule called Custom Meta. Custom Meta opens the way for the flexible definition of tags which can be used to drive custom applications outside the typical SEO requirements of a site.
Categories: FLOSS Project Planets

Kay Hayen: Nuitka Release 2.5

Planet Python - Tue, 2024-11-26 18:00

This is to inform you about the new stable release of Nuitka. It is the extremely compatible Python compiler, “download now”.

This release focused on Python 3.13 support, but also on improved compatibility, made many performance optimizations, enhanced error reporting, and better debugging support.

Bug Fixes
  • Windows: Fixed a regression in onefile mode that incorrectly handled program and command line paths containing spaces. Fixed in 2.4.4 already.

  • Windows: Corrected an issue where console output handles were being opened with closed file handles. Fixed in 2.4.2 already.

  • Standalone: Restored the ability to use trailing slashes on the command line to specify the target directory for data files on Windows. Fixed in 2.4.2 already.

  • Compatibility: Fixed a parsing error that occurred with relative imports in .pyi files, which could affect some extension modules with available source code. Fixed in 2.4.3 already.

  • Modules: Ensured that extension modules load correctly into packages when using Python 3.12. Fixed in 2.4.4 already.

  • Windows: Improved command line handling for onefile mode to ensure full compatibility with quoting. Fixed in 2.4.4 already.

  • Data Directories: Allowed the use of non-normalized paths on the command line when specifying data directories. Fixed in 2.4.5 already.

  • Python 3.11+: Fixed an issue where inspect module functions could raise StopIteration when examining compiled functions on the stack. Fixed in 2.4.5 already.

  • importlib_metadata: Improved compatibility with importlib_metadata by handling cases where it might be broken, preventing potential compilation crashes. Fixed in 2.4.5 already.

  • Plugins: Fixed a crash that occurred when using the no_asserts YAML configuration option. Fixed in 2.4.6 already.

  • Scons: Improved error tolerance when reading ccache log files to prevent crashes on Windows caused by non-ASCII module names or paths. Fixed in 2.4.11 already.

  • Scons: Prevented the C standard option from being applied to C++ compilers, resolving an issue with the splash screen on Windows when using Clang. Fixed in 2.4.8 already.

  • macOS: Enhanced handling of DLL self-dependencies to accommodate cases where DLLs use both .so and .dylib extensions for self-references. Fixed in 2.4.8 already.

  • Compatibility: Fixed a memory leak that occurred when using deepcopy on compiled methods. Fixed in 2.4.9 already.

  • MSYS2: Excluded the bin directory from being considered a system DLL folder when determining DLL inclusion. Fixed in 2.4.9 already.

  • Python 3.10+: Fixed a crash that could occur when a match statement failed to match a class with arguments. Fixed in 2.4.9 already.

  • MSYS2: Implemented a workaround for non-normalized paths returned by os.path.normpath in MSYS2 Python environments. Fixed in 2.4.11 already.

  • Python 3.12: Resolved an issue where Nuitka’s constant code was triggering assertions in Python 3.12.7. Fixed in 2.4.10 already.

  • UI: Ensured that the --include-package option includes both Python modules and extension modules that are sub-modules of the specified package. Fixed in 2.4.11 already.

  • Windows: Prevented encoding issues with CMD files used for accelerated mode on Windows.

  • Standalone: Improved the standard library scan to avoid assuming the presence of specific files, which might have been deleted by the user or a Python distribution.

  • Compatibility: Added suffix, suffixes, and stem attributes to Nuitka resource readers to improve compatibility with file objects.

  • Compatibility: Backported the error message change for yield from used at the module level, using dynamic detection instead of hardcoded text per version.

  • Compatibility: Fixed an issue where calling built-in functions with keyword-only arguments could result in errors due to incorrect argument passing.

  • Compatibility: Fixed reference leaks that occurred when using list.insert and list.index with 2 or 3 arguments.

  • Windows: Prioritized relative paths over absolute paths for the result executable when absolute paths are not file system encodable. This helps address issues related to non-ASCII short paths on some Chinese systems.

  • Compatibility: Improved compatibility with C extensions by handling cases where the attribute slot is not properly implemented, preventing potential segfaults.

  • Compatibility: Prevented the leakage of sys.frozen when using the multiprocessing module and its plugin, resolving a long-standing TODO and potentially breaking compatibility with packages that relied on this behavior.

  • Compatibility: Fixed an issue where matching calls with keyword-only arguments could lead to incorrect optimization and argument passing errors.

  • Compatibility: Corrected the handling of iterators in for loops to avoid assuming the presence of slots, preventing potential issues.

  • macOS: Added support for cyclic DLL dependencies, where DLLs have circular references.

  • Compatibility: Ensured the use of updated expressions during optimization phase for side effects to prevent crashes caused by referencing obsolete information.

  • Python 3.10+: Fixed a crash that could occur in complex cases when re-formulating match statements.

  • Python 3.4-3.5: Corrected an issue in Nuitka’s custom PyDict_Next implementation that could lead to incorrect results in older Python 3 versions.

  • Python 3.10+: Ensured that AttributeError is raised with the correct keyword arguments, avoiding a TypeError that occurred previously.

  • Plugins: Added a data file function that avoids loading packages, preventing potential crashes caused by incompatible dependencies (e.g., numpy versions).

  • Compatibility: Ensured that Nuitka’s package reader closes data files after reading them to prevent resource warnings in certain Python configurations.

  • Standalone: Exposed setuptools contained vendor packages in standalone distributions to match the behavior of the setuptools package.

  • Accelerated Mode: Enabled the django module parameter in accelerated mode to correctly detect used extensions.

  • Compatibility: Prevented resource warnings for unclosed files when trace outputs are sent to files via command line options.

  • Compatibility: Enabled the use of xmlrpc.server without requiring the pydoc module.

  • Plugins: Fixed an issue in the anti-bloat configuration where change_function and change_classes ignored “when” clauses, leading to unintended changes.

  • Python 3.12 (Linux): Enhanced static libpython handling for Linux. Static libpython is now used only when the inline copy is available (not in official Debian packages). The inline copy of hacl is used for all Linux static libpython uses with Python 3.12 or higher.

  • Standalone: Further improved the standard library scan to avoid assuming the presence of files that might have been manually deleted.

  • UI: Fixed the --include-raw-dir option, which was not functioning correctly. Only the Nuitka Package configuration was being used previously.

Package Support
  • arcade: Improved standalone configuration for the arcade package. Added in 2.4.3 already.

  • license-expression: Added a missing data file for the license-expression package in standalone distributions. Added in 2.4.6 already.

  • pydantic: Included a missing implicit dependency required for deprecated decorators in the pydantic package to function correctly in standalone mode. Fixed in 2.4.5 already.

  • spacy: Added a missing implicit dependency for the spacy package in standalone distributions. Added in 2.4.7 already.

  • trio: Updated standalone support for newer versions of the trio package. Added in 2.4.8 already.

  • tensorflow: Updated standalone support for newer versions of the tensorflow package. Added in 2.4.8 already.

  • pygame-ce: Added standalone support for the pygame-ce package. Added in 2.4.8 already.

  • toga: Added standalone support for newer versions of the toga package on Windows. Added in 2.4.9 already.

  • django: Implemented a workaround for a django debug feature that attempted to extract column numbers from compiled frames. Added in 2.4.9 already.

  • PySide6: Improved standalone support for PySide6 on macOS by allowing the recognition of potentially unusable plugins. Added in 2.4.9 already.

  • polars: Added a missing dependency for the polars package in standalone distributions. Added in 2.4.9 already.

  • django: Enhanced handling of cases where the django settings module parameter is absent in standalone distributions. Added in 2.4.9 already.

  • win32ctypes: Included missing implicit dependencies for win32ctypes modules on Windows in standalone distributions. Added in 2.4.9 already.

  • arcade: Added a missing data file for the arcade package in standalone distributions. Added in 2.4.9 already.

  • PySide6: Allowed PySide6 extras to be optional on macOS in standalone distributions, preventing complaints about missing DLLs when they are not installed. Added in 2.4.11 already.

  • driverless-selenium: Added standalone support for the driverless-selenium package. Added in 2.4.11 already.

  • tkinterdnd2: Updated standalone support for newer versions of the tkinterdnd2 package. Added in 2.4.11 already.

  • kivymd: Updated standalone support for newer versions of the kivymd package. Added in 2.4.11 already.

  • gssapi: Added standalone support for the gssapi package. Added in 2.4.11 already.

  • azure.cognitiveservices.speech: Added standalone support for the azure.cognitiveservices.speech package on macOS.

  • mne: Added standalone support for the mne package.

  • fastapi: Added a missing dependency for the fastapi package in standalone distributions.

  • pyav: Updated standalone support for newer versions of the pyav package.

  • py_mini_racer: Added standalone support for the py_mini_racer package.

  • keras: Improved standalone support for keras by extending its sub-modules path to include the keras.api sub-package.

  • transformers: Updated standalone support for newer versions of the transformers package.

  • win32com.server.register: Updated standalone support for newer versions of the win32com.server.register package.

  • Python 3.12+: Added support for distutils in setuptools for Python 3.12 and later.

  • cv2: Enabled automatic scanning of missing implicit imports for the cv2 package in standalone distributions.

  • lttbc: Added standalone support for the lttbc package.

  • win32file: Added a missing dependency for the win32file package in standalone distributions.

  • kivy: Fixed an issue where the kivy clipboard was not working on Linux due to missing dependencies in standalone distributions.

  • paddleocr: Added missing data files for the paddleocr package in standalone distributions.

  • playwright: Added standalone support for the playwright package with a new plugin.

New Features
  • Python 3.13: Added experimental support for Python 3.13.

    Warning

    Python 3.13 support is not yet recommended for production use due to limited testing. On Windows, only MSVC and ClangCL are currently supported due to workarounds needed for incompatible structure layouts.

  • UI: Introduced a new --mode selector to replace the options --standalone, --onefile, --module, and --macos-create-app-bundle (a usage sketch follows at the end of this list).

    Note

    The app mode creates an app bundle on macOS and a onefile binary on other operating systems to provide the best deployment option for each platform.

  • Windows: Added a new hide choice for the --windows-console-mode option. This generates a console program that hides the console window as soon as possible, although it may still briefly flash.

  • UI: Added the --python-flag=-B option to disable the use of bytecode cache (.pyc) files during imports. This is mainly relevant for accelerated mode and dynamic imports in non-isolated standalone mode.

  • Modules: Enabled the generation of type stubs (.pyi files) for compiled modules using an inline copy of stubgen. This provides more accurate and informative type hints for compiled code.

    Note

    Nuitka also adds implicit imports to compiled extension modules, ensuring that dependencies are not hidden.

  • Plugins: Changed the data files configuration to a list of items, allowing the use of when conditions for more flexible control. Done in 2.4.6 already.

  • Onefile: Removed the MSVC requirement for the splash screen in onefile mode. It now works with MinGW64, Clang, and ClangCL. Done for 2.4.8 already.

  • Reports: Added information about the file system encoding used during compilation to help debug encoding issues.

  • Windows: Improved the attach mode for --windows-console-mode when forced redirects are used.

  • Distutils: Added the ability to disable Nuitka in pyproject.toml builds using the build_with_nuitka setting. This allows falling back to the standard build backend without modifying code or configuration. This setting can also be passed on the command line using --config-setting.

  • Distutils: Added support for commercial file embedding in distutils packages.

  • Linux: Added support for using uninstalled self-compiled Python installations on Linux.

  • Plugins: Enabled the matplotlib plugin to react to active Qt and tkinter plugins for backend selection.

  • Runtime: Added a new original_argv0 attribute to the __compiled__ value to provide access to the original start value of sys.argv[0], which might be needed by applications when Nuitka modifies it to an absolute path (a short sketch follows at the end of this list).

  • Reports: Added a list of DLLs that are actively excluded because they are located outside of the PyPI package.

  • Plugins: Allowed plugins to override the compilation mode for standard library modules when necessary.
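
To illustrate the new --mode selector mentioned above, here is a rough usage sketch (hedged: the option name and the app value come from this changelog, while the other values are assumed to mirror the options they replace):

    # old style: one option per mode
    python -m nuitka --onefile my_program.py

    # new style: a single selector; "app" produces an app bundle on macOS
    # and a onefile binary on other platforms, per the note above
    python -m nuitka --mode=app my_program.py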
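
And a small Python sketch of reading the new original_argv0 attribute mentioned above (assumption: checking for __compiled__ in globals() is used here only to detect a compiled program; adapt the check to your own setup):

    import sys

    if "__compiled__" in globals():
        # Compiled with Nuitka: sys.argv[0] may have been rewritten to an
        # absolute path, so use the preserved original start value instead.
        start_value = __compiled__.original_argv0
    else:
        start_value = sys.argv[0]

    print("Started as:", start_value)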

Optimization
  • Performance: Implemented experimental support for “dual types”, which can significantly speed up integer operations in specific cases (achieving speedups of 12x or more in some very specific loops). This feature is still under development but shows promising potential for future performance gains, especially when combined with future PGO (Profile Guided Optimization) work that reveals likely runtime types more often, and as more types become covered.

  • Performance: Improved the speed of module variable access.

    • For Python 3.6 to 3.10, this optimization utilizes dictionary version tags but may be less effective when module variables are frequently written to.

    • For Python 3.11+, it relies on dictionary key versions, making it less susceptible to dictionary changes but potentially slightly slower for cache hits compared to Python 3.10.

  • Performance: Accelerated string dictionary lookups for Python 3.11+ by leveraging knowledge about the key and the module dictionary’s likely structure. This also resolves a previous TODO item, where initial 3.11 support was not as fast as our support for 3.10 was in this domain.

  • Performance: Optimized module dictionary updates to occur only when values actually change, improving caching efficiency.

  • Performance: Enhanced exception handling by removing bloat in the abstracted differences between Python 3.12 and earlier versions. This simplifies the generated C code, reduces conversions, and improves efficiency for all Python versions. This affects both C compile time and runtime performance favorably and solves a huge TODO for Python 3.12 performance.

  • Performance: Removed the use of CPython API calls for accessing exception context and cause values, which can be slow.

  • Performance: Utilized Nuitka’s own faster methods for creating int and long values, avoiding slower CPython API calls.

  • Performance: Implemented a custom variant of _PyGen_FetchStopIterationValue to avoid CPython API calls in generator handling, further improving performance on generators, coroutines and asyncgen.

  • Windows: Aligned with CPython’s change in reference counting implementation on Windows for Python 3.12+, which improves performance with LTO (Link Time Optimization) enabled.

  • Optimization: Expanded static optimization to include unary operations, improving the handling of number operations and preparing for full support of dual types.

  • Optimization: Added static optimization for os.stat and os.lstat calls.

  • Performance: Passed the exception state directly into unpacking functions, eliminating redundant exception fetching and improving code efficiency.

  • Performance: Introduced a dedicated helper for unpacking length checks, resulting in faster and more compact code that also helps scalability.

  • Performance: Generated more efficient code for raising built-in exceptions by directly creating them through the base exception’s new method instead of calling them as functions. This can substantially speed up code paths that raise built-in exceptions frequently.

  • Performance: Optimized exception creation by avoiding unnecessary tuple allocations for empty exceptions, reducing pressure on the memory allocator.

  • Performance: Replaced remaining uses of PyTuple_Pack with Nuitka’s own helpers to avoid CPython API calls.

  • Code Generation: Replaced implicit exception raise nodes with direct exception creation nodes for improved C code generation.

  • Windows: Aligned with CPython’s change in managing object reference counters on Windows for Python 3.12+, improving performance with LTO enabled.

  • Performance: Removed remaining CPython API calls when creating int values in various parts of the code, including specialization code, helpers, and constants loading.

  • Windows: Avoided scanning for DLLs in the PATH environment variable when they are not intended to be used from the system. This prevents potential crashes related to non-encodable DLL paths and makes those scans faster too.

  • Windows: Updated the MinGW64 version used from 13.2 to 14.2 for potentially improved binary code generation with that compiler.

  • Code Size: Reduced the size of constant blobs by avoiding module-level constants for the global values -1, 0, and 1.

  • Code Generation: Improved code generation for variables by directly placing NameError exceptions into the thread state when raised, making for more compact C code.

  • Optimization: Statically optimized the sys.ps1 and sys.ps2 values to not exist (unless in module mode), potentially enabling more static optimization in packages that check them to detect interactive usage.

  • Performance: Limited the use of tqdm locking to no-GIL and Scons builds where threading is actively used.

  • Optimization: Implemented a faster check for non-frame statement sequences by decoupling frames and normal statement sequences and using dedicated accessors. This improves performance during the optimization phase.

Anti-Bloat
  • Prevented the inclusion of importlib_metadata for the numpy package. Added in 2.4.2 already.

  • Avoided the use of dask in the pandera package. Added in 2.4.5 already.

  • Removed numba for newer versions of the shap package. Added in 2.4.6 already.

  • Prevented attempts to include both Python 2 and Python 3 code for the aenum package, avoiding SyntaxError warnings. Added in 2.4.7 already.

  • Enhanced handling for the sympy package. Added in 2.4.7 already.

  • Allowed pydoc for the pyqtgraph package. Added in 2.4.7 already.

  • Avoided pytest in the time_machine package. Added in 2.4.9 already.

  • Avoided pytest in the anyio package.

  • Avoided numba in the pandas package.

  • Updated anti-bloat measures for newer versions of the torch package with increased coverage.

  • Avoided pygame.tests and cv2 for the pygame package.

  • Allowed unittest in the absl.testing package.

  • Allowed setuptools in the tufup package.

  • Avoided test modules when using the bsdiff4 package.

  • Treated the use of the wheel module the same as using the setuptools package.

Organizational
  • Development Environment: Added experimental support for a devcontainer to the repository, providing an easier way to set up a Linux-based development environment. This feature is still under development and may require further refinement.

  • Issue Reporting: Clarified the issue reporting process on GitHub, emphasizing the importance of testing reproducers against Python first to ensure the issue is related to Nuitka.

  • Issue Reporting: Discouraged the use of --deployment in issue reports, as it hinders the automatic identification of issues; it should be the first thing removed when reproducing a problem.

  • UI: Improved the clarity of the help message for the option that marks data files as external, emphasizing that files must be included before being used.

  • UI: Added checks to the Qt plugins to ensure that specified plugin families exist, preventing unnoticed errors.

  • UI: Implemented heuristic detection of terminal link support, paving the way for adding links to options and groups in the command line interface.

  • UI: Removed obsolete caching-related options from the help output, as they have been replaced by more general options.

  • Plugins: Improved error messages when retrieving information from packages during compilation.

  • Quality: Implemented a workaround for an isort bug that prevented it from handling UTF-8 comments.

  • Quality: Updated GitHub actions to use clang-format-20.

  • Quality: Updated to the latest version of black for code formatting.

  • Release Process: Updated the release script tests for Debian and PyPI to use the correct runner names. Changed in 2.4.1 already.

  • UI: Disabled progress bar locking, as Nuitka currently doesn’t utilize threads.

  • UI: Added heuristic detection of terminal link support and introduced an experimental terminal link as a first step towards a more interactive command line interface.

  • Debugging: Fixed a crash in the “explain reference counts” feature that could occur with unusual dict values mistaken for modules.

  • Debugging: Included reference counts of tracebacks when dumping reference counts at program end.

  • Debugging: Added assertions and traces to improve debugging of input/output handling.

  • Quality: Added checks for configuration module names in Nuitka package configuration to catch errors caused by using filenames instead of module names.

  • UI: Removed obsolete options controlling cache behavior, directing users to the more general cache options.

  • Scons: Ensured that the CC environment variable is used consistently for --version and onefile bootstrap builds, as well as the Python build, preventing inconsistencies in compiler usage and outputs.

  • Distutils: Added the compiled-package-hidden-by-package mnemonic for use in distutils to handle the expected warning when a Python package is replaced with a compiled package and the Python code has not yet been deleted.

  • Dependency Management: Started experimental support for downloading Nuitka dependencies like ordered-set. This feature is not yet ready for general use.

Tests
  • Added Python 3.13 to the GitHub Actions test matrix.

  • Significantly enhanced construct-based tests for clearer results. The new approach executes code with a boolean flag instead of generating different code, potentially leading to the removal of custom templating.

  • Removed the 2to3 conversion code from the test suite, as it is being removed from newer Python versions. Tests are now split with version requirements as needed.

  • Fixed an issue where the test runner did not discover and use Python 3.12+, resulting in insufficient test coverage for those versions on GitHub Actions.

  • Ensured that the compare_with_cpython test function defaults to executing the system’s Python interpreter instead of relying on the PYTHON environment variable.

  • Set up continuous integration with Azure Pipelines to run Nuitka tests against the factory branch on each commit.

  • Enforced the use of static libpython for construct-based tests to eliminate DLL call overhead and provide more accurate performance measurements.

  • Improved the robustness of many construct tests, making them less sensitive to unrelated optimization changes.

  • Removed a test that was only applicable to Nuitka Commercial, as it was not useful to always skip it in the standard version. Commercial tests are now also recognized by their names.

  • Added handling for segmentation faults in distutils test cases, providing debug output for easier diagnosis of these failures.

  • Prevented resource warnings for unclosed files in a reflected test.

Cleanups
  • WASI: Corrected the signatures of C function getters and setters for compiled types in WASI to ensure they match the calling conventions. Casts are now performed locally to the compiled types instead of in the function signature. Call entries also have the correct signature used by Python C code.

  • WASI: Improved code cleanliness by adhering to PyCFunction signatures in WASI.

  • Code Generation: Fixed a regression in code generation that caused misaligned indentation in some cases.

  • Code Formatting: Changed some code for identical formatting with clang-format-20 to eliminate differences between the new and old versions.

  • Caching: Enforced proper indentation in Nuitka cache files stored in JSON format.

  • Code Cleanliness: Replaced checks for Python 3.4 or higher with checks for Python 3, simplifying the code and reflecting the fact that Python 3.3 is no longer supported.

  • Code Cleanliness: Removed remaining Python 3.3 specific code from frame templates.

  • Code Cleanliness: Performed numerous spelling corrections and renamed internal helper functions for consistency and clarity.

  • Plugins: Renamed the get_module_directory helper function in the Nuitka Package configuration to remove the leading underscore, improving readability.

  • Plugins: Moved the numexpr.cpuinfo workaround to the appropriate location in the Nuitka Package configuration, resolving an old TODO item.

Summary

This is a major release that brings support for Python 3.13, relatively soon after its release.

Our plugin system and the Nuitka plugin configuration were used extensively to support many more third-party packages, along with numerous other enhancements in the domain of avoiding bloat.

This release focuses on improved compatibility, new breakthrough performance optimizations to build on in the future, enhanced error reporting, and better debugging support.

Categories: FLOSS Project Planets

EuroPython Society: List of EPS Board Candidates for 2024/2025

Planet Python - Tue, 2024-11-26 17:27

At this year’s EuroPython Society General Assembly (GA), planned for Sunday, December 1st, 2024, 20:00 CET, we will vote in a new board of the EuroPython Society for the term 2024/2025.

List of Board Candidates

The EPS bylaws require one chair, one vice chair and 2 - 7 board members. The following candidates have stated their willingness to work on the EPS board. We are presenting them here (in alphabetical order by first name).

We will be updating this list in the days before the GA. Please send in any nominations or self-nominations to board@europython.eu. For more information please check our previous post here: https://europython-society.org/2024-general-assembly-announcement/

Please note that our bylaws do not restrict nominations to people on this list. It is even possible to self-nominate or nominate other candidates at the GA itself. However, in the interest of giving members a better chance to review the candidate list, we’d like to encourage all nominations to be made before the GA.

The following fine folks have expressed their desire to run for the next EPS board elections: Anders Hammarquist, Aris Nivorlis, Artur Czepiel, Cyril Bitterich, Mia Bajić, Shekhar Koirala.

Anders Hammarquist

Pythonista / Consultant / Software architect

Anders has been running his own Python consultancy business, AB Struse, since 2019 and is currently mostly involved with using Python in industrial automation. He has been using Python since 1995 and has fostered its use in at least four companies.

He helped organize EuroPython 2004 and 2005, and has attended and given talks at several EuroPythons since then. He has handled the Swedish financials of the EuroPython Society since 2016 and has served as board member since 2017.

Aris Nivorlis

Pythonista / Geoscientist / Data Steward

Aris is a researcher at Deltares, a non-profit research institute in the Netherlands, where he combines his expertise in geoscience with his passion for Python and data stewardship to address real-world challenges. His journey with Python started during his doctoral studies, where he became a passionate advocate for the language and supported several colleagues in adopting Python for their research.

Aris has been involved in the Python community for the past four years. He is the Chair of PyCon Sweden and has been a core organizer for the past three conferences. He was a EuroPython organizer in 2024, leading the Ops Team, an experience he describes as both challenging and incredibly rewarding.

Aris is running for the EuroPython Society (EPS) Board to help shape its future direction. He is particularly interested in how the EPS can further support local Python communities, events, and projects, while ensuring the success of the EuroPython conference. Aris aims to build on the foundation of previous efforts, working toward a more independent and sustainable organising team for EuroPython. One of his key goals is to lower the barriers for others to get involved as volunteers, organizers, and board members, fostering a more inclusive and accessible society.

Artur Czepiel (nomination for Chair)

Software developer

I started using Python in 2008 and attended my first EuroPython in 2016, and it’s been an incredible journey ever since. Over the years, I’ve had the opportunity to contribute to five EuroPython conferences, including serving four terms on the board and chairing the conference last year.

Despite that, I still have new ideas for improvements and a couple of unfinished projects I’d like to see moving forward. :)

Cyril Bitterich

Operations dude / Organiser / Systems Engineer

Cyril’s first contact with the EuroPython community was back in 2019 in Basel during a break between workshops. Starting with helping prepare goodie bags, he went on to assist with setting up the conference location, and took on the role of a general runner during the event. After becoming a late addition to the Ops team in Dublin, he was part of the Ops and Programme teams for both Prague editions of the conference. His firefighting skills proved invaluable as he supported other teams and the general organisation, making him a de facto member of the 2024 core organisers team.

Enjoying the warmth of the community and gaining experience in a lot of different roles, it’s now time for him to pass on the lessons he’s learned in a more structured way. Having taken on smaller and ad-hoc leadership roles during the conference, he now aims to play an even more active role in the year-to-year operations of the EuroPython community.

With a background in successfully playing firefighter at organising capoeira events for 100 to 2000 attendees at locations all over the world (Asia and Antarctica are still on the to-do list 😉), and working with leaders, teams and attendees from diverse cultures, his knowledge and experience extend far beyond the last three years of EuroPython conferences.

His goal is to bring more structure to the EPS as a whole and to the conference teams where needed. A key focus is ensuring effective communication, be it through improving documentation or maintaining open and regular dialogue with the teams and everyone else involved. Ultimately, he aims to help strike a healthy balance between the community-driven, volunteer spirit and the professionalism expected from an established conference like EuroPython. However, the aim is not to disrupt or rebuild everything from scratch. Instead, he seeks to build on the strong foundation established by the board members active in recent years - while shamelessly taking full advantage of their counsel along the way.

This should establish a stable foundation within the EuroPython Society that can be adopted or directly used by other local communities in Europe. On the EPS side, this will hopefully open up resources to support other communities based on their specific needs.

Mia Bajić

Software Engineer & Community Events Organizer

I’m a software engineer and community events organizer. Since joining the Python community in 2021, I’ve led Pyvo meetups, brought Python Pizza to the Czech Republic, and contributed to PyCon CZ 23 as well as EuroPython 2023 & 2024. I’ve spoken on technical topics at major conferences, including PyCon US, DjangoCon, EuroPython and many other PyCons across Europe.

I’m running for the board to learn, grow, and give back to the community that has given me so much. The main topic I would like to focus on is sustainability. I value setting clear goals, fostering open and transparent culture, and delivering measurable results.

Inspired by the folks from the Django Software Foundation, I decided to share more about my background, motivations, and small improvements I’d like to see implemented this year on my blog: https://clytaemnestra.github.io/tech-blog/eps-elections.

Shekhar Koirala

Machine Learning Engineer

I joined EuroPython as a remote volunteer in 2022 and later became part of the onsite volunteer team in Dublin the same year. Over the past two years, I’ve worked with the Ops team, supporting other onsite volunteers. My volunteering experience also includes contributions to Python Ireland and PyData Kathmandu. This year, I attended PyData Amsterdam, PyCon Sweden, and a non-Python volunteer-led conference in Berlin, Germany, where I sought to understand what truly makes a conference great. I realized that, at the core, it’s the people and their love for the community that make a conference exceptional.

Whether or not I join the board, I will continue volunteering for EuroPython. But if I am selected as a board member, I aim to support volunteers and foster a wholesome environment, just as I have always experienced. I also want to strengthen the connection between EuroPython and the local community.

Professionally, I work as a Machine Learning Engineer at Identv, using Python for both work and hobbies. Recently, I delivered a talk and conducted a workshop at PyCon Ireland, and I’m excited to kickstart my open-source journey. Besides sitting in front of the computer, I love to hike, take photos and try not to get lost in the wild.

What does the EPS Board do?

The EPS board is made up of up to 9 directors (including 1 chair and 1 vice chair); the board runs the day-to-day business of the EuroPython Society, including running the EuroPython conference series, and supports the community through various initiatives such as our grants programme. The board collectively takes up the fiscal and legal responsibility of the Society.

For more details you can check our previous post here: https://europython-society.org/2024-general-assembly-announcement/#what-does-the-board-do

Categories: FLOSS Project Planets

Dries Buytaert: Drupal 11 released

Planet Drupal - Tue, 2024-11-26 15:29

Today is a big day for Drupal as we officially released Drupal 11!

In recent years, we've seen an uptick in innovation in Drupal. Drupal 11 continues this trend with many new and exciting features. You can see an overview of these improvements in the video below:

Drupal 11 has been out for less than six hours, and updating my personal site was my first order of business this morning. I couldn't wait! Dri.es is now running on Drupal 11.

I'm particularly excited about two key features in this release, which I believe are transformative and will likely reshape Drupal in the years ahead:

  1. Recipes (experimental): This feature allows you to add new features to your website by applying a set of predefined configurations.
  2. Single-Directory Components: SDCs simplify front-end development by providing a component-based workflow where all necessary code for each component lives in a single, self-contained directory.

These two new features represent a major shift in how developers and site builders will work with Drupal, setting the stage for even greater improvements in future releases. For example, we'll rely heavily on them in Drupal Starshot.

Drupal 11 is the result of contributions from 1,858 individuals across 590 organizations. These numbers show how strong and healthy Drupal is. Community involvement remains one of Drupal's greatest strengths. Thank you to everyone who contributed to Drupal 11!

Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #657 (Nov. 26, 2024)

Planet Python - Tue, 2024-11-26 14:30

#657 – NOVEMBER 26, 2024
View in Browser »

NumPy Practical Examples: Useful Techniques

In this tutorial, you’ll learn how to use NumPy by exploring several interesting examples. You’ll read data from a file into an array and analyze structured arrays to perform a reconciliation. You’ll also learn how to quickly chart an analysis and turn a custom function into a vectorized function.
REAL PYTHON
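
As a flavor of what the tutorial covers, here is a minimal structured-array sketch (the field names and values are made up for illustration):

    import numpy as np

    # A structured array: each row has named, typed fields.
    accounts = np.array(
        [("alice", 120.5), ("bob", 99.5)],
        dtype=[("name", "U10"), ("balance", "f8")],
    )
    print(accounts["balance"].sum())  # 220.0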

Loop Targets

Loop assignment allows you to assign to a dict item in a for loop. This post covers what that means and that it is no more costly than regular assignment.
NED BATCHELDER
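
In short, a for loop target can be any assignment target, not just a plain name. A tiny sketch (not taken from the article itself):

    row = {}
    for row["n"] in range(3):
        # Each pass assigns directly into the dict, just like row["n"] = n.
        print(row)  # {'n': 0}, then {'n': 1}, then {'n': 2}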

Introducing the Windsurf Editor. Tomorrow’s IDE, Today

Introducing the Windsurf Editor, the first agentic IDE. All the features you know and love from Codeium’s extensions plus new capabilities such as Cascade that act as collaborative AI agents, combining the best of copilot and agent systems →
CODEIUM sponsor

Vector Animations With Python

This post shows you how to use gizeh and moviepy to create animations using vector graphics.
DEEPNOTE.COM

Take the 2024 Django Developers Survey

DJANGO SOFTWARE FOUNDATION

Python 3.14.0 Alpha 2 Released

CPYTHON DEV BLOG

Quiz: Python Dictionary Comprehensions

REAL PYTHON

PyTexas (Austin) Call for Proposals

PYTEXAS.ORG

Quiz: Namespaces and Scope in Python

REAL PYTHON

Articles & Tutorials

Avoid Counting in Django Pagination

Django’s Paginator class separates data into chunks to display pages of results. By default, underneath it uses a SQL COUNT(*) call which for large amounts of data can be expensive. This article shows you how to get around that.
NIK TOMAZIC

Working With TOML and Python

TOML is a configuration file format that’s becoming increasingly popular in the Python community. In this video course, you’ll learn the syntax of TOML and explore how you can work with TOML files in your own projects.
REAL PYTHON course
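
For a taste of what reading TOML looks like in code, here is a minimal sketch using the standard-library tomllib module (Python 3.11+; the file name is just an example):

    import tomllib

    with open("pyproject.toml", "rb") as f:  # tomllib requires binary mode
        config = tomllib.load(f)

    print(config.get("project", {}).get("name"))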

Categories of Leadership on Technical Teams

Understanding the different types of technical leadership can help you better structure teams and define roles and responsibilities. This post talks about the various types of technical leaders you might encounter.
BEN KUHN

Quick Prototyping With Sqlite3

Python’s sqlite3 bindings make it a great tool for quick prototyping while you’re working out what it is exactly that you’re building and what kind of database schema makes sense.
JUHA-MATTI SANTALA
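
A minimal sketch of that kind of throwaway prototyping with the standard-library sqlite3 module (the table and data are made up):

    import sqlite3

    con = sqlite3.connect(":memory:")  # disposable in-memory database
    con.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")
    con.execute("INSERT INTO notes (body) VALUES (?)", ("first idea",))
    for row in con.execute("SELECT id, body FROM notes"):
        print(row)
    con.close()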

Django Performance and Optimization

“This document provides an overview of techniques and tools that can help get your Django code running more efficiently - faster, and using fewer system resources.”
DJANGO SOFTWARE FOUNDATION

Python Dependency Management Is a Dumpster Fire

Managing dependencies in Python can be a bit of a challenge. This deep-dive article shows you all the problems and how they can be mitigated, if not solved.
NIELS CAUTAERTS

Under the Microscope: Ecco the Dolphin

This article shows the use of Ghidra and Python to reverse engineer the encoding scheme of the Sega Dreamcast game Ecco the Dolphin.
32BITS

Is Python Really That Slow?

The speed of Python is an age-old conversation. This post talks about just what it does and doesn’t mean.
MIGUEL GRINBERG

Threads Beat Async/Await

Further musings about async & await and why Armin thinks virtual threads are a better model.
ARMIN RONACHER

Python for R Users

Resources for getting better at Python for experienced R developers
STEPHEN TURNER

Projects & Code

nbformat: Base Implementation of Jupyter Notebook Format

PYPI.ORG

pex: Generate Python EXecutable Files, Lock Files and Venvs

GITHUB.COM/PEX-TOOL

Async, Pure-Python Server-Side Rendering Engine

VOLFPETER.GITHUB.IO • Shared by Peter Volf

django-tasks: Background Workers Reference Implementation

GITHUB.COM/REALORANGEONE

dsRAG: Retrieval Engine for Unstructured Data

GITHUB.COM/D-STAR-AI

Events

Weekly Real Python Office Hours Q&A (Virtual)

November 27, 2024
REALPYTHON.COM

Python New Zealand: Mapping Hurricane Wind Speeds From Space

November 28, 2024
MEETUP.COM

SPb Python Drinkup

November 28, 2024
MEETUP.COM

PyCon Wroclaw 2024

November 30 to December 1, 2024
PYCONWROCLAW.COM

PyDay Togo 2024

November 30 to December 1, 2024
LU.MA

PyDelhi User Group Meetup

November 30, 2024
MEETUP.COM

Happy Pythoning!
This was PyCoder’s Weekly Issue #657.
View in Browser »

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

MidCamp - Midwest Drupal Camp: Session Submission Now Open for MidCamp 2025!

Planet Drupal - Tue, 2024-11-26 11:25
Session Submission Now Open for MidCamp 2025!

As the season of gratitude approaches, we’re excited to celebrate you—our future speakers! If you've got an idea for a session, now's the time to get involved in MidCamp 2025, happening May 20-22 in Chicago.

Call for Speakers

Since 2014, MidCamp has hosted over 300 amazing sessions, and we’re ready to add your talk to that legacy. We’re seeking presentations for all skill levels—from Drupal beginners to advanced users, as well as end users and business professionals. Check out our session tracks for full details on the types of talks we’re looking for.

Not quite ready? No worries! Join us for one of our Speaker Workshops:

  • December 2024 (TBD): Crafting an Outstanding Proposal
  • March 2025 (TBD): Polishing Your Presentation (Open to both MidCamp and DrupalCon Atlanta 2025 presenters!)

Key Dates

  • Session Proposals Open: November 25, 2024

Submit your session now!

  • Proposal Deadline: January 12, 2025
  • Speakers Notified: Week of February 17, 2025
  • MidCamp Sessions: May 20-21, 2025

P.S.

We hear Florida is also calling for submissions—but let’s be real, we know where your heart lies. 😊

Sponsor MidCamp

Looking to connect with the Drupal community? Sponsoring MidCamp is the way to do it! With packages starting at $600, there are opportunities to suit any budget. Whether you’re recruiting talent, growing your brand, or simply supporting the Drupal ecosystem, MidCamp sponsorship offers great value.

Act early to maximize your exposure!

Learn more about sponsorship opportunities

Stay in the Loop

Don’t miss a beat!

Keep an eye on our news page and subscribe to our newsletter for updates on the venue, travel options, social events, and speaker announcements.

Ready to submit your session? Let’s make MidCamp 2025 unforgettable!

Categories: FLOSS Project Planets

Web Search Keywords

Planet KDE - Tue, 2024-11-26 10:32

Did you know about a small but very useful feature from KDE?

Open krunner via Alt+Space and type qw:KDE to search Qwant for KDE:

Pressing Enter will open up your browser with the specified KDE search on Qwant!

There are a lot of other Web Search Keywords, for example Wikipedia, Invent, and translating from English to German on dict.cc.

You can find all of them by opening Web Search Keywords on krunner:

Extra: Create your own!

I often use a Fedora tool called COPR, so let’s use it as an example to create our own web search keyword.

Do your search in the webpage:

And now the important part you need:

Now back to the Web Search Keyword settings:

Fill in the data needed, taking care to use the placeholder for our input:
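
Since the screenshots are not included here, a rough text sketch of what goes into the dialog (the field labels are approximate, and the COPR search URL is an assumption for illustration; copy the real one from your browser’s address bar after searching. \{@} is the placeholder KDE substitutes with whatever you type after the keyword):

    Name:      COPR
    Keyword:   copr
    Query URL: https://copr.fedorainfracloud.org/coprs/fulltext/?fulltext=\{@}

After that, typing copr:some-package in krunner opens that search in the browser.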

Now we have our own Web Search Keyword:

NOTE: for some reason I had to click on Apply and OK until all the different settings windows were closed before the new custom Web Search Keyword worked.

And the result:

I hope it’s useful to somebody!

Categories: FLOSS Project Planets

Matt Glaman: Restrict Composer dependency updates to only patch releases

Planet Drupal - Tue, 2024-11-26 09:00

I was doing website maintenance and checked for outdated dependencies with composer outdated. I usually filter with -D for checking direct dependencies and -p for packages with patch releases. These are typically easy pickings. I saw I was on 2.1.3 of the Honeypot module and 2.1.4 was available. So I ran composer update drupal/honeypot. I noticed the module was updated to 2.2.0, because my Composer constraint is drupal/honeypot: ^2.0, allowing minor updates. I figured that was fine enough. Turns out it wasn't.
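
For readers wondering how to actually restrict a package to patch releases, here is a hedged sketch using Composer’s tilde operator (standard Composer behavior; whether this is exactly what the full post recommends is not shown in this excerpt):

    # ~2.1.3 means >=2.1.3 and <2.2.0, so "composer update" only picks up patch releases
    composer require "drupal/honeypot:~2.1.3"

which leaves composer.json with:

    "require": {
        "drupal/honeypot": "~2.1.3"
    }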

Categories: FLOSS Project Planets

Real Python: Managing Dependencies With Python Poetry

Planet Python - Tue, 2024-11-26 09:00

When your Python project relies on external packages, you need to make sure you’re using the right version of each package. After an update, a package might not work as it did before. A dependency manager like Python Poetry helps you specify, install, and resolve external packages in your projects. This way, you can be sure that you always work with the correct dependency version on every machine.

In this video course, you’ll learn how to:

  • Create a new project using Poetry
  • Add Poetry to an existing project
  • Configure your project through pyproject.toml
  • Pin your project’s dependency versions
  • Install dependencies from a poetry.lock file
  • Run basic Poetry commands using the Poetry CLI
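
To make those steps concrete, here is a minimal sketch of what pinning looks like in a Poetry-managed pyproject.toml (the package name and versions are just examples):

    [tool.poetry.dependencies]
    python = "^3.10"
    requests = "^2.31"  # caret constraint: >=2.31.0, <3.0.0

Running poetry lock then writes the exact resolved versions to poetry.lock, and poetry install reproduces them on any machine.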

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Highlights from the Digital Public Goods Alliance Annual Members Meeting 2024

Open Source Initiative - Tue, 2024-11-26 06:45

This month, I had the privilege of representing the Open Source Initiative at the Digital Public Goods Alliance (DPGA) Annual Members Meeting. Held in Singapore, this event marked our second year participating as members, following our first participation in Ethiopia. It was an inspiring gathering of innovators, developers and advocates working to create a thriving ecosystem for Digital Public Goods (DPGs) as part of UNICEF’s initiative to advance the United Nations’ sustainable development goals (SDGs).

Keynote insights

The conference began with an inspiring keynote by Liv Marte Nordhaug, CEO of the DPGA, who made a call for governments and organizations to incorporate:

  1. Open Source first principles
  2. Open data at scale
  3. Interoperable Digital Public Infrastructure (DPI)
  4. DPGs as catalysts for climate change action

These priorities underscored the critical role of DPGs in fostering transparency, accountability and innovation across sectors.

DPGs and Open Source AI

A standout feature of the first day of the event was the Open Source AI track led by Amreen Taneja, DPGA Standards Lead, which encompassed three dynamic sessions:

  1. Toward AI Democratization with Digital Public Goods: This session explored the role of DPGs in contributing to the democratization of AI technologies, including AI use, development and governance, to ensure that everyone benefits from AI technology equally.
  2. Fully Open Public Interest AI with Open Data: This session highlighted the need for open AI infrastructure supported by accessible, high-quality datasets, especially in global majority countries. Discussions revolved around how open training datasets ought to be licensed to ensure open, public interest AI.
  3. Creating Public Value with Public AI: This session examined real-world applications of generative AI in public services. Governments and NGOs showcased how AI-enabled tools can effectively tackle social challenges, leveraging Open Source solutions within the AI stack.
DPG Product Fair: showcasing innovation

The second day of the event was marked by the DPG Product Fair, which provided a science-fair-style platform for showcasing DPGs. Notable examples included:

  • India’s eGov DIGIT Open Source platform, serving over 1 billion citizens with robust digital infrastructure.
  • Singapore’s Open Government Products, which leverages Open Source and enables open collaboration, allowing this small nation to expand its impact together with other Southeast Asian nations.

One particularly engaging session was Sarah Espaldon’s “Improving Government Services with Legos.” This presentation from the Singapore government highlighted the benefits of modular DPGs in enhancing service delivery and building flexible DPI capabilities.

Privacy best practices for DPGs

A highlight from the third and final day of the event was the privacy-focused workshop co-hosted by the DPGA and the Open Knowledge Foundation. As privacy becomes a central concern for DPGs, the DPGA introduced its Standard Expert Group to refine privacy requirements and develop best practice guidelines. The interactive session provided invaluable feedback, driving forward the development of robust privacy standards for DPGs.

Looking ahead

The event reaffirmed the potential of Open Source technologies to transform global public goods. As we move forward, the Open Source Initiative is committed to advancing these conversations, including around the Open Source AI Definition and fostering a more inclusive digital ecosystem. A special thanks to the dedicated DPGA team—Liv Marte Nordhaug, Lucy Harris, Max Kintisch, Ricardo Miron, Luciana Amighini, Bolaji Ayodeji, Lea Gimpel, Pelin Hizal Smines, Jon Lloyd, Carol Matos, Amreen Taneja, and Jameson Voisin—whose efforts made this conference a success. We look forward to another year of impactful collaboration and to the Annual Members Meeting next year in Brazil!

Categories: FLOSS Research
