Feeds

There are plenty of ways to socialize at LibrePlanet 2024: Cultivating Community

FSF Blogs - Wed, 2024-04-03 11:10
In this blog, we're sharing with you all the ways you can socialize and participate in LibrePlanet 2024: Cultivating Community outside of the official program.
Categories: FLOSS Project Planets

Real Python: Install and Execute Python Applications Using pipx

Planet Python - Wed, 2024-04-03 10:00

A straightforward way to distribute desktop and command-line applications written in Python is to publish them on the Python Package Index (PyPI), which hosts hundreds of thousands of third-party packages. Many of these packages include runnable scripts, but using them requires decent familiarity with the Python ecosystem. With pipx, you can safely install and execute such applications without affecting your global Python interpreter.

In this tutorial, you’ll learn how to:

  • Turn the Python Package Index (PyPI) into an app marketplace
  • Run installed applications without explicitly calling Python
  • Avoid dependency conflicts between different applications
  • Try throw-away applications in temporary locations
  • Manage the installed applications and their environments

To fully benefit from this tutorial, you should feel comfortable around the terminal. In particular, knowing how to manage Python versions, create virtual environments, and install third-party modules in your projects will go a long way.

Note: If you’re a Windows user, then it’s highly recommended you follow our Python coding setup guide before plunging into this tutorial. The gist of it is that you should avoid installing Python from the Microsoft Store, as it could prevent pipx from working correctly.

To help you get to grips with pipx, you can download the supplemental materials, which include a handy command cheat sheet. Additionally, you can test your understanding by taking a short quiz.

Get Your Cheatsheet: Click here to download the free cheatsheet of pipx commands you can use to install and execute Python applications.

Take the Quiz: Test your knowledge with our interactive “Install and Execute Python Applications Using pipx” quiz. Upon completion you will receive a score so you can track your learning progress over time:

Take the Quiz »

Get Started With pipx

On the surface, pipx resembles pip because it also lets you install Python packages from PyPI or another package index. However, unlike pip, it doesn’t install packages into your system-wide Python interpreter or even an activated virtual environment. Instead, it automatically creates and manages virtual environments for you to isolate the dependencies of every package that you install.

Additionally, pipx adds a symbolic link to a directory on your PATH for every command-line script exposed by the installed packages. As a result, you can invoke those scripts directly from the command line without explicitly running them through the Python interpreter.

Think of pipx as Python’s equivalent of npx in the JavaScript ecosystem. Both tools let you install and execute third-party modules in the command line just as if they were standalone applications. However, not all modules are created equal.

Broadly speaking, you can classify the code distributed through PyPI into three categories:

  1. Importable: It’s either pure-Python source code or Python bindings of compiled shared objects that you want to import in your Python projects. Typically, they’re libraries like Requests or Polars, providing reusable pieces of code to help you solve a common problem. Alternatively, they might be frameworks like FastAPI or PyGame that you build your applications around.
  2. Runnable: These are usually command-line utility tools like black, isort, or flake8 that assist you during the development phase. They could also be full-fledged applications like bpython or the JupyterLab environment, which is primarily implemented in another language entirely, TypeScript.
  3. Hybrid: They combine both worlds by providing importable code and runnable scripts at the same time. Flask and Django are good examples, as they offer utility scripts while remaining web frameworks for the most part.

Making a distribution package runnable or hybrid involves defining one or more entry points in the corresponding configuration file. Historically, these would be setup.py or setup.cfg, but modern build systems in Python should generally rely on the pyproject.toml file and define their entry points in the [project.scripts] TOML table.

Note: If you use Poetry to manage your project’s dependencies, then you can add the appropriate script declarations in the tool-specific [tool.poetry.scripts] table.

Each entry point represents an independent script that you can run by typing its name at the command prompt. For example, if you’ve ever used the django-admin command, then you’ve invoked an entry point to the Django framework.
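
As a concrete illustration that isn't part of the original tutorial, a hypothetical package could expose a console script by pointing an entry point at a plain Python function. The [project.scripts] snippet shown in the comment is what would make pipx treat the package as runnable; the greet package and function names are made up:

# greet/cli.py in a hypothetical package named greet
#
# In pyproject.toml you would declare the entry point as:
#
#   [project.scripts]
#   greet = "greet.cli:main"
#
# After installation, typing `greet` at the prompt calls main() below.
import argparse


def main() -> None:
    """Entry point for the hypothetical greet command."""
    parser = argparse.ArgumentParser(description="Print a friendly greeting.")
    parser.add_argument("name", nargs="?", default="world")
    args = parser.parse_args()
    print(f"Hello, {args.name}!")


if __name__ == "__main__":
    main()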

Note: Don’t confuse entry points, which link to individual functions or callables in your code, with runnable Python packages that rely on the __main__ module to provide a command-line interface.

For example, Rich is a library of building blocks for creating text-based user interfaces in Python. At the same time, you can run this package with python -m rich to display a demo application that illustrates various visual components at your fingertips. Despite this, pipx won’t recognize it as runnable because the library doesn’t define any entry points.
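
For contrast, here is a minimal sketch of my own (a hypothetical layout, not Rich's actual code) of a package that's runnable with python -m but defines no entry points, so pipx won't install it as an application:

# demo_pkg/__main__.py, executed by `python -m demo_pkg`
def main() -> None:
    print("Hello from python -m demo_pkg")


if __name__ == "__main__":
    main()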

To sum up, the pipx tool will only let you install Python packages with at least one entry point. It’ll refuse to install packages like Rich that don’t define any, as well as bare-bones libraries that ship Python code meant just for importing.

Once you identify a Python package with entry points that you’d like to use, you should first create and activate a dedicated virtual environment as a best practice. By keeping the package isolated from the rest of your system, you’ll eliminate the risk of dependency conflicts across various projects that might require the same Python library in different versions. Furthermore, you won’t need superuser permissions to install the package.

Deciding where and how to create a virtual environment and then remembering to activate it every time before running the corresponding script can become a burden. Fortunately, pipx automates these steps and provides even more features that you’ll explore in this tutorial. But first, you need to get pipx running itself.
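
As a rough sketch of the manual steps that pipx automates (my own example, not Real Python's; the environment path and the black package are just placeholders), you could do the equivalent with the standard-library venv module and that environment's own pip:

import subprocess
import sys
import venv
from pathlib import Path

# Placeholder location and package; pipx manages its own equivalents of these.
env_dir = Path.home() / ".venvs" / "black"

# 1. Create an isolated virtual environment that has pip available.
venv.EnvBuilder(with_pip=True).create(env_dir)

# 2. Install the application into that environment only.
bin_dir = env_dir / ("Scripts" if sys.platform == "win32" else "bin")
subprocess.run([str(bin_dir / "pip"), "install", "black"], check=True)

# 3. Run the application's script straight from the environment, no
#    activation needed. pipx goes further and also links it onto your PATH.
subprocess.run([str(bin_dir / "black"), "--version"], check=True)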

Test Drive pipx Without Installation

If you’re unsure whether pipx will address your needs and would prefer not to commit to it until you’ve properly tested the tool, then there’s good news! Thanks to a self-contained executable available for download, you can give pipx a spin without having to install it.

To get that executable, visit the project’s release page on the official GitHub repository in your web browser and grab the latest version of a file named pipx.pyz. Files with the .pyz extension represent runnable Python ZIP applications, which are essentially ZIP archives containing Python source code and some metadata, akin to JAR files in Java. They can optionally vendor third-party dependencies that you’d otherwise have to install by hand.
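
To make the .pyz format less abstract, here is a small example of my own (not from the tutorial) that uses the standard-library zipapp module to bundle a hypothetical myapp/ directory, containing a cli.py with a main() callable, into a single runnable archive:

import zipapp

# Bundle ./myapp into myapp.pyz; `main` names the callable to run on startup.
zipapp.create_archive(
    "myapp",
    target="myapp.pyz",
    interpreter="/usr/bin/env python3",
    main="cli:main",
    compressed=True,
)

# Afterwards it runs like any script: `python myapp.pyz` (or ./myapp.pyz on Unix).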

Note: Internally, the pipx project uses shiv to build its Python ZIP application. When you first run a Python ZIP application that was built with shiv, it’ll unpack itself into a hidden folder named .shiv/ located in your user’s home directory. As a result, subsequent runs of the same application will reuse the already extracted files, speeding up the startup time.

Afterward, you can run pipx.pyz by passing the path to your downloaded copy of the file to your Python interpreter—just as you would with a regular Python script:

Read the full article at https://realpython.com/python-pipx/ »

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Django Weblog: Django bugfix release issued: 5.0.4

Planet Python - Wed, 2024-04-03 09:52

Today we've issued the 5.0.4 bugfix release.

The release package and checksums are available from our downloads page, as well as from the Python Package Index. The PGP key ID used for this release is Natalia Bidart: 2EE82A8D9470983E.

Django 3.2 has reached the end of extended support

Note that with this release, Django 3.2 has reached the end of extended support. All Django 3.2 users are encouraged to upgrade to Django 4.2 or later to continue receiving fixes for security issues.

See the downloads page for a table of supported versions and the future release schedule.

Categories: FLOSS Project Planets

Robin Wilson: Simple self-hosted OpenStreetMap routing using Valhalla and Docker

Planet Python - Wed, 2024-04-03 06:39

I came up with an interesting plan for an artistic map recently (more on that when I’ve finished working on it), and to create it I needed to be able to calculate a large number of driving routes around Southampton, my home city.

Specifically, I needed to be able to get lines showing the driving route from A to B for around 5,000 combinations of A and B. I didn’t want to overload a free hosted routing server, or have to deal with rate limiting – so I decided to look into running some sort of routing service myself, and it was actually significantly easier than I thought it would be.

It so happened that I’d come across a talk recently on a free routing engine called Valhalla. I hadn’t actually watched the talk yet (it is on my ever-expanding list of talks to watch), but it had put the name into my head – so I started investigating Valhalla. It seemed to do what I wanted, so I started working out how to run it locally. Using Docker, it was nice and easy – and to make it even easier for you, here’s a set of instructions.

  1. Download an OpenStreetMap extract for your area of interest. All of your routes will need to be within the area of this extract. I was focusing on Southampton, UK, so I downloaded the Hampshire extract from the England page on GeoFabrik. If you start from the home page you should be able to navigate to any region in the world.

  2. Put the downloaded file (in this case, hampshire-latest.osm.pbf) in a folder called custom_files. You can download multiple files and put them all in this folder and they will all be processed.

  3. Run the following Docker command:

    docker run -p 8002:8002 -v $PWD/custom_files:/custom_files ghcr.io/gis-ops/docker-valhalla/valhalla:latest

    This will run the valhalla Docker image, exposing port 8002 and mapping the custom_files subdirectory to /custom_files inside the container. Full docs for the Docker image are available here.

  4. You’ll see various bits of output as Valhalla processes the OSM extract, and eventually the output will stop appearing and the API server will start.

  5. Visit http://localhost:8002 and you should see an error – this is totally expected, it is just telling you that you haven’t used one of the valid API endpoints. This shows that the server is running properly.

  6. Start using the API. See the documentation for instructions on what to pass the API.

Once you’ve got the server running, it’s quite easy to call the API from Python and get the resulting route geometries as Shapely LineString objects. These can easily be put into a GeoPandas GeoDataFrame. For example:

import json
import urllib.parse

import requests
import shapely
from pypolyline.cutil import decode_polyline

# Set up the API request parameters - in this case, from one point
# to another, via car
data = {"locations": [{"lat": from_lat, "lon": from_lon},
                      {"lat": to_lat, "lon": to_lon}],
        "costing": "auto"}
# Convert to JSON and make the URL
path = f"http://localhost:8002/route?json={urllib.parse.quote(json.dumps(data))}"
# Get the URL
resp = requests.get(path)
# Extract the geometry of the route (ie. the line from start to end point)
# This is in the polyline format that needs decoding
polyline = bytes(resp.json()['trip']['legs'][0]['shape'], 'utf8')
# Decode the polyline
decoded_coords = decode_polyline(polyline, 6)
# Convert to a shapely LineString
geom = shapely.LineString(decoded_coords)

To run this, you'll need to install pypolyline and requests, plus shapely (which you'll already have if GeoPandas is installed).

Note that you need to pass a second parameter of 6 into decode_polyline or you’ll get nonsense out (this parameter tells it that it is in polyline6 format, which seems to not be documented particularly well in the Valhalla documentation). Also, I’m sure there is a better way of passing JSON in a URL parameter using requests, but I couldn’t find it – whatever I did I got a JSON parsing error back from the API. The urllib.parse.quote(json.dumps(data)) code was the simplest thing I found that worked.
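
One aside of my own, not something tested in the original post: the upstream Valhalla API docs describe accepting the request JSON as a POST body, and if your server build supports that, requests' json argument sidesteps the URL-encoding question entirely. Reusing data and the imports from the snippet above:

# Assumes the local Valhalla server accepts the request JSON as a POST body,
# as described in the upstream API docs; otherwise stick with the GET approach.
resp = requests.post("http://localhost:8002/route", json=data)
resp.raise_for_status()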

This code could easily be extended to work with multi-leg routes, to extract other data like the length or duration of the route, and more. The Docker image can also do more, like load public transport information, download OSM files itself and even more – see the docs for more on this.
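
Not part of the original post, but as a sketch of how this extends to many A-to-B pairs: wrapping the call in a function and collecting the results into a GeoDataFrame might look roughly like this. The pairs list, the example coordinates, and the EPSG:4326 assumption (check the coordinate order pypolyline gives you) are all placeholders of mine:

import json
import urllib.parse

import geopandas as gpd
import requests
import shapely
from pypolyline.cutil import decode_polyline


def route_geometry(from_lat, from_lon, to_lat, to_lon):
    """Ask the local Valhalla server for a driving route and return a LineString."""
    data = {"locations": [{"lat": from_lat, "lon": from_lon},
                          {"lat": to_lat, "lon": to_lon}],
            "costing": "auto"}
    path = f"http://localhost:8002/route?json={urllib.parse.quote(json.dumps(data))}"
    resp = requests.get(path)
    resp.raise_for_status()
    shape = bytes(resp.json()["trip"]["legs"][0]["shape"], "utf8")
    return shapely.LineString(decode_polyline(shape, 6))


# Placeholder for your ~5,000 origin/destination combinations.
pairs = [(50.9097, -1.4044, 50.9280, -1.3960)]

routes = gpd.GeoDataFrame(
    {"route_id": range(len(pairs))},
    geometry=[route_geometry(*pair) for pair in pairs],
    crs="EPSG:4326",
)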

Check back soon to see how I used this for some cool map-based art.

Categories: FLOSS Project Planets

Guido Günther: Free Software Activities March 2024

Planet Debian - Wed, 2024-04-03 06:12

A short status update of what happened on my side last month. I spent quite a bit of time reviewing new code (thanks!) as well as on maintenance to keep things going, but we also have some improvements:

  • Phosh
  • Phoc
  • phosh-mobile-settings
  • phosh-osk-stub
  • gmobile
  • Livi
  • squeekboard
  • GNOME Calls
  • Libsoup

If you want to support my work see donations.

Categories: FLOSS Project Planets

Tag1 Consulting: Drupal Core Test Suite Improved Runtime By 10% With Gander

Planet Drupal - Wed, 2024-04-03 05:00

The Drupal community has continuously sought ways to enhance the performance and efficiency of Drupal sites. The performance testing framework Gander has been part of Drupal core since version 10.2. The result of joint efforts between the Google Chrome team and Tag1 Consulting, this powerful tool is specifically designed to optimize Drupal performance. Optimized performance ensures that sites are not only fast but also efficient and sustainable. Today, we will take a closer look at how Gander played a crucial role in improving the Drupal core test suite runtime by 10%.

Identifying A Core Performance Issue

Gander's impact on Drupal development was recently highlighted by its identification of a performance issue within Drupal core. The issue (#3410312) reported a particular code section being called redundantly during automated test runs and on live websites, resulting in delays.

The Bottleneck Identified

Drupal is designed to use the flood system for user logins. It first checks if a flood protection table exists in the database. If it does not exist, Drupal postpones the creation of the table until it needs to write to it instead of creating the missing table immediately. What can happen is...

Categories: FLOSS Project Planets

Joey Hess: reflections on distrusting xz

Planet Debian - Wed, 2024-04-03 04:48

Was the ssh backdoor the only goal that "Jia Tan" was pursuing with their multi-year operation against xz?

I doubt it, and if not, then every fix so far has been incomplete, because everything is still running code written by that entity.

If we assume that they had a multilayered plan, that their every action was calculated and malicious, then we have to think about the full threat surface of using xz. This quickly gets into nightmare scenarios of the "trusting trust" variety.

What if xz contains a hidden buffer overflow or other vulnerability, that can be exploited by the xz file it's decompressing? This would let the attacker target other packages, as needed.

Let's say they want to target gcc. Well, gcc contains a lot of documentation, which includes png images. So they spend a while getting accepted as a documentation contributor on that project, and get a specially constructed png file added to it: one with additional binary data appended that exploits the buffer overflow and instructs xz to modify the source code that comes later when decompressing gcc.tar.xz.

More likely, they wouldn't bother with an actual trusting trust attack on gcc, which would be a lot of work to get right. One problem with the ssh backdoor is that well, not all servers on the internet run ssh. (Or systemd.) So webservers seem a likely target of this kind of second stage attack. Apache's docs include png files, nginx does not, but there's always scope to add improved documentation to a project.

When would such a vulnerability have been introduced? In February, "Jia Tan" wrote a new decoder for xz. This added 1000+ lines of new C code across several commits. So much code and in just the right place to insert something like this. And why take on such a significant project just two months before inserting the ssh backdoor? "Jia Tan" was already fully accepted as maintainer, and doing lots of other work, it doesn't seem to me that they needed to start this rewrite as part of their cover.

They were working closely with xz's author Lasse Collin in this, by indications exchanging patches offlist as they developed it. So Lasse Collin's commits in this time period are also worth scrutiny, because they could have been influenced by "Jia Tan". One that caught my eye comes immediately afterwards: "prepares the code for alternative C versions and inline assembly". Multiple versions and assembly mean even more places to hide such a security hole.

I stress that I have not found such a security hole, I'm only considering what the worst case possibilities are. I think we need to fully consider them in order to decide how to fully wrap up this mess.

Whether such stealthy security holes have been introduced into xz by "Jia Tan" or not, there are definitely indications that the ssh backdoor was not the end of what they had planned.

For one thing, the "test file" based system they introduced was extensible. They could have been planning to add more test files later, that backdoored xz in further ways.

And then there's the matter of the disabling of the Landlock sandbox. This was not necessary for the ssh backdoor, because the sandbox is only used by the xz command, not by liblzma. So why did they potentially tip their hand by adding that rogue "." that disables the sandbox?

A sandbox would not prevent the kind of attack I discuss above, where xz is just modifying code that it decompresses. Disabling the sandbox suggests that they were going to make xz run arbitrary code, that perhaps wrote to files it shouldn't be touching, to install a backdoor in the system.

Both deb and rpm use xz compression, and with the sandbox disabled, whether they link with liblzma or run the xz command, a backdoored xz can write to any file on the system while dpkg or rpm is running and no one is likely to notice, because that's the kind of thing a package manager does.

My impression is that all of this was well planned and they were in it for the long haul. They had no reason to stop with backdooring ssh, except for the risk of additional exposure. But they decided to take that risk, with the sandbox disabling. So they planned to do more, and every commit by "Jia Tan", and really every commit that they could have influenced needs to be distrusted.

This is why I've suggested to Debian that they revert to an earlier version of xz. That would be my advice to anyone distributing xz.

I do have a xz-unscathed fork which I've carefully constructed to avoid all "Jia Tan" involved commits. It feels good to not need to worry about dpkg and tar. I only plan to maintain this fork minimally, eg security fixes. Hopefully Lasse Collin will consider these possibilities and address them in his response to the attack.

Categories: FLOSS Project Planets

Talking Drupal: Skills Upgrade #5

Planet Drupal - Wed, 2024-04-03 04:00

Welcome back to “Skills Upgrade” a Talking Drupal mini-series following the journey of a D7 developer learning D10. This is episode 5.

Topics
  • Review Chad's goals for the previous week

    • .gitignore
    • Field Example module
    • Plugin API
    • Drupal 10 Masterclass book
  • Review Chad's questions

    • Field Example follow up
  • Tasks for the upcoming week

    • Examples module: js_example module
      • js_example.libraries.yml
      • hook_theme() implementation in js_example.module
      • JsExampleController
      • template files
Resources

  • .gitignore
  • Drupal 10 Masterclass
  • Modernizing Drupal 10 Theme Development
  • Chad's Drupal 10 Learning Curriculum & Journal
  • Chad's Drupal 10 Learning Notes

The Linux Foundation is offering a 30% discount on e-learning courses, certifications, and bundles with the code DRUPAL24 (all uppercase), valid until June 5th: https://training.linuxfoundation.org/certification-catalog/

Hosts

AmyJune Hineline - @volkswagenchick

Guests

Chad Hester - chadkhester.com @chadkhest
Mike Anello - DrupalEasy.com @ultimike

Categories: FLOSS Project Planets

Kubuntu Brand Graphic Design Contest Deadline Extended!

Planet KDE - Wed, 2024-04-03 01:28

We’re thrilled to announce that due to the incredible engagement and enthusiasm from our community, the Kubuntu Council has decided to extend the submission deadline for the Kubuntu Brand Graphic Design Contest! Originally set to close at 23:59 on March 31, 2024, we’re giving you more time to unleash your creativity and submit your designs. The new deadline is now set for 23:59 on Saturday, 6th April 2024.

The decision comes in response to multiple requests from community members who are keen on participating but needed a bit more time to polish their submissions. We’ve been overwhelmed by the vibrant community response so far and are eager to see more of your innovative designs that embody the spirit of Kubuntu.

New Timeline:
  • Extended Submission Deadline: 23:59, Saturday, 6th April 2024
  • Review by Kubuntu Council: Sunday, 7th April 2024
  • Announcement of Results: Monday, 8th April 2024, at 20:00 UTC

The results will be announced via an update on the Kubuntu website, so stay tuned!

This extension is our way of saying thank you for the incredible effort and participation from the Kubuntu community. We want to ensure that everyone who wishes to contribute has the opportunity to do so. Whether you’re putting the finishing touches on your design or just getting started, now is your chance to be part of shaping the visual future of Kubuntu.

We look forward to seeing your creative concepts and designs. Let’s make this contest a showcase of the talent and passion that defines our community. Good luck to all participants, and thank you for making Kubuntu not just a powerful Linux distribution, but a vibrant community as well.

For contest details, submission guidelines, and to submit your work, visit the official Kubuntu Brand Graphic Design Contest page on our website. Let’s create something amazing together!

The Kubuntu Council

Additional Information may be found in the original Kubuntu Graphic Design Contest Post

Apply now: Contest Page

Categories: FLOSS Project Planets

Acquia Developer Portal Blog: Local environment for Acquia Site Factory

Planet Drupal - Tue, 2024-04-02 23:19

The purpose of this tutorial is to explain how to set up a best practice local environment for Drupal multisite, including on Acquia Site Factory:

  1. Configure Lando to support wildcard DNS.
  2. In settings.php, set an "App ID" to use in code for a specific multisite in any environment.
  3. In settings.php, set database, public files, memcache prefix, Solr core, and other settings per "App ID".
  4. Configure Drush aliases per multisite and per environment.
  5. Write bash scripts to push and pull sites using these aliases.
1. Configure Lando with wildcard DNS

Lando is a docker orchestration framework for development environments.

There

Categories: FLOSS Project Planets

Salsa Digital: Splash Awards 2024

Planet Drupal - Tue, 2024-04-02 20:14
The categories

The Splash Awards categories for 2024 were:

  • Corporate
  • Government (Federal)
  • Government (State/local)
  • Open source
  • An overall winner

Congratulations to all the winners, runner-ups and nominees! Salsa is proud to have three nominees (GovCMS, Bendigo Courts and Drupal360) and one runner-up, wa.gov.au.
Categories: FLOSS Project Planets

Matt Layman: NATS: Connecting Apps Over a Network Easily

Planet Python - Tue, 2024-04-02 20:00
NATS is an awesome open source technology to help connect code together over a network. Whether you’re building a distributed microservice architecture or connecting IoT devices, NATS provides the tools you need to do that easily. In this talk, you’ll learn about NATS via a presentation with plenty of live coding examples.
Categories: FLOSS Project Planets

Arnaud Rebillout: Firefox: Moving from the Debian package to the Flatpak app (long-term?)

Planet Debian - Tue, 2024-04-02 20:00

First, thanks to Samuel Henrique for giving notice of recent Firefox CVEs in Debian testing/unstable.

At the time I didn't want to upgrade my system (Debian Sid) due to the ongoing t64 transition, so I decided I could install the Firefox Flatpak app instead, and why not stick to it long-term?

This blog post details all the steps, if ever others want to go the same road.

Flatpak Installation

Disclaimer: this section is hardly anything more than a copy/paste of the official documentation, and with time it will get outdated, so you'd better follow the official doc.

First thing first, let's install Flatpak:

$ sudo apt update
$ sudo apt install flatpak

Then the next step is to add the Flathub remote repository, from where we'll get our Flatpak applications:

$ flatpak remote-add --if-not-exists flathub https://dl.flathub.org/repo/flathub.flatpakrepo

And that's all there is to it! Now come the optional steps.

For GNOME and KDE users, you might want to install a plugin for the software manager specific to your desktop, so that it can support and manage Flatpak apps:

$ which -s gnome-software && sudo apt install gnome-software-plugin-flatpak
$ which -s plasma-discover && sudo apt install plasma-discover-backend-flatpak

And here's an additional check you can do, as it's something that did bite me in the past: missing xdg-desktop-portal-* packages, which are required for Flatpak applications to communicate with the desktop environment. Just to be sure, you can check the output of apt search '^xdg-desktop-portal' to see what's available, and compare with the output of dpkg -l | grep xdg-desktop-portal.

As you can see, if you're a GNOME or KDE user, there's a portal backend for you, and it should be installed. For reference, this is what I have on my GNOME desktop at the moment:

$ dpkg -l | grep xdg-desktop-portal | awk '{print $2}'
xdg-desktop-portal
xdg-desktop-portal-gnome
xdg-desktop-portal-gtk

Install the Firefox Flatpak app

This is trivial, but still, there's a question I've always asked myself: should I install applications system-wide (aka. flatpak --system, the default) or per-user (aka. flatpak --user)? Turns out, this question is answered in the Flatpak documentation:

Flatpak commands are run system-wide by default. If you are installing applications for day-to-day usage, it is recommended to stick with this default behavior.

Armed with this new knowledge, let's install the Firefox app:

$ flatpak install flathub org.mozilla.firefox

And that's about it! We can give it a go already:

$ flatpak run org.mozilla.firefox

Data migration

At this point, running Firefox via Flatpak gives me an "empty" Firefox. That's not what I want, instead I want my usual Firefox, with a gazillion of tabs already opened, a few extensions, bookmarks and so on.

As it turns out, Mozilla provides a brief doc for data migration, and it's as simple as moving Firefox data directory around!

To clarify, we'll be copying data:

  • from ~/.mozilla/ -- where the Firefox Debian package stores its data
  • into ~/.var/app/org.mozilla.firefox/.mozilla/ -- where the Firefox Flatpak app stores its data

Make sure that all Firefox instances are closed, then proceed:

# BEWARE! Below I'm erasing data!
$ rm -fr ~/.var/app/org.mozilla.firefox/.mozilla/firefox/
$ cp -a ~/.mozilla/firefox/ ~/.var/app/org.mozilla.firefox/.mozilla/

To avoid confusing myself, it's also a good idea to rename the local data directory:

$ mv ~/.mozilla/firefox ~/.mozilla/firefox.old.$(date --iso-8601=date)

At this point, flatpak run org.mozilla.firefox takes me to my "usual" everyday Firefox, with all its tabs opened, pinned, bookmarked, etc.

More integration?

After following all the steps above, I must say that I'm 99% happy. So far, everything works as before, I didn't hit any issue, and I don't even notice that Firefox is running via Flatpak, it's completely transparent.

So where's the 1% of unhappiness? The « Run a Command » dialog from GNOME, the one that shows up via the keyboard shortcut <Alt+F2>. This is how I start my GUI applications, and I usually run two Firefox instances in parallel (one for work, one for personal), using the firefox -p <profile> command.

Given that I ran apt purge firefox before (to avoid confusing myself with two installations of Firefox), now the right (and only) way to start Firefox from a command-line is to type flatpak run org.mozilla.firefox -p <profile>. Typing that every time is way too cumbersome, so I need something quicker.

Seems like the most straightforward is to create a wrapper script:

$ cat /usr/local/bin/firefox
#!/bin/sh
exec flatpak run org.mozilla.firefox "$@"

And now I can just hit <Alt+F2> and type firefox -p <profile> to start Firefox with the profile I want, just as before. Neat!

Looking forward: system updates

I usually update my system manually every now and then, via the well-known pair of commands:

$ sudo apt update
$ sudo apt full-upgrade

The downside of introducing Flatpak, ie. introducing another package manager, is that I'll need to learn new commands to update the software that comes via this channel.

Fortunately, there's really not much to learn. From flatpak-update(1):

flatpak update [OPTION...] [REF...]

Updates applications and runtimes. [...] If no REF is given, everything is updated, as well as appstream info for all remotes.

Could it be that simple? Apparently yes, the Flatpak equivalent of the two apt commands above is just:

$ flatpak update

Going forward, my options are:

  1. Teach myself to run flatpak update in addition to apt update, manually, every time I update my system.
  2. Go crazy: let something automatically update my Flatpak apps, behind my back and without my consent.

I'm actually tempted to go for option 2 here, and I wonder if GNOME Software will do that for me, provided that I installed gnome-software-plugin-flatpak, and that I checked « Software Updates -> Automatic » in the Settings (which I did).

However, I didn't find any documentation regarding what this setting really does, so I can't say if it will only download updates, or if it will also install them. I'd be happy if it automatically installs new versions of Flatpak apps, but at the same time I'd be very unhappy if it automatically upgrades my Debian system...

So we'll see. Enough for today, hope this blog post was useful!

Categories: FLOSS Project Planets

Dirk Eddelbuettel: ulid 0.3.1 on CRAN: New Maintainer, Some Polish

Planet Debian - Tue, 2024-04-02 19:14

Happy to share that ulid is now (back) on CRAN. It provides universally unique identifiers that are lexicographically sortable, which improves over the more well-known uuid generators.

ulid is a neat little package put together by Bob Rudis a few years ago. It had recently drifted off CRAN so I offered to brush it up and re-submit it. And as tooted earlier today, it took just over an hour to finish that (after the lead up work I had done, including prior email with CRAN in the loop, the repo transfer from Bob’s to my ulid repo plus of course a wee bit of actual maintenance; see below for more).

The NEWS entry follows.

Changes in version 0.3.1 (2024-04-02)
  • New Maintainer

  • Deleted several repository files no longer used or needed

  • Added .editorconfig, ChangeLog and cleanup

  • Converted NEWS.md to NEWS.Rd

  • Simplified R/ directory to one source file

  • Simplified src/ removing redundant Makevars

  • Added ulid() alias

  • Updated / edited roxygen and README.md documentation

  • Removed vignette which was identical to README.md

  • Switched continuous integration to GitHub Actions

  • Placed upstream (header-only) library into src/ulid/

  • Renamed single interface file to src/wrapper

If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

The Drop Times: For Drupal to Remain Well and Alive: An Exclusive Conversation with Tim Doyle

Planet Drupal - Tue, 2024-04-02 17:09
Discover the future of Drupal and the open-source community in our exclusive interview with Tim Doyle, CEO of the Drupal Association. Learn about the innovative Open Web Alliance, Drupal's journey as a Digital Public Good, and the strategic plans shaping the platform's future. Dive into Tim's vision for a collaborative, inclusive, and technologically advanced Drupal ecosystem that champions the open web. Don't miss these insightful revelations and strategies set to redefine the landscape of open-source content management systems.
Categories: FLOSS Project Planets

Drupal Association blog: DrupalCon Portland 2024: The Nonprofit Summit Agenda is here!

Planet Drupal - Tue, 2024-04-02 15:43

I am pleased to share the schedule for the upcoming 2024 DrupalCon Nonprofit Summit. There is a special rate ($395.00) for nonprofit org staff and those affiliated with nonprofits, and the summit is included free with your ticket! You can register here.

Relying on community feedback and past experience, we put together an agenda that we hope encompasses the spirit of open source camaraderie and will provide nourishment for the mind and soul. We tried to balance the technical with the strategy and networking with expertise. We look forward to seeing you there.

Agenda

9:00 am - 9:15 am: Welcome and overview

Julia Kranzthor

9:15 am - 10:30 am: Why Should Nonprofits Use Drupal? The Case for Owning Your Own Data and Using Drupal to Manage It.

Fireside Chat with Tim Lehnen, Johanna Bates, and Jess Snyder

10:30 am - 10:45 am: Break
10:45 am - 11:00 am: Sponsor Case Study #1
11:00 am - 12:15 pm: Breakout Sessions

Round Table Discussions

  • Using Drupal to Promote Engagement With Your Audience: Tools, Challenges, and Measurement

  • Web Analytics for Nonprofits: Google Analytics 4 and Alternatives.

  • Thriving as a Lone Wolf: Navigating the Challenges of Being the Only Drupalist at a Nonprofit

  • Migrating from Drupal 7 to Drupal 10

  • Managing a Major Website Rebuild/Migration

  • Birds of a Feather: Topic to be determined on-site

12:15 pm - 1:15 pm: Lunch

A time for relaxing, and networking if you feel like it.

1:15 pm - 2:30 pm: Breakout Sessions

Round Table Discussions

  • Web Accessibility and Site Governance

  • Using Drupal in Small Nonprofits with Limited Staff and Financial Resources

  • Preparing for Impact on Your Website Redesign

  • Development and Hosting Challenges for Nonprofits

  • Leveraging CiviCRM with Drupal: Open Source CRM for Contact Management and Engagement Tracking

  • Birds of a Feather: Topic to be determined on-site

2:30 pm - 2:45 pm: Sponsor Case Study #2
2:45 pm - 3:00 pm: Break
3:00 pm - 4:00 pm: Drupal 10 Migration: How to Stop Kicking the Can (of Worms) Down the Road

Panel discussion with Tim Lehnen and Fran Garcia-Linares

4:00 pm - 5:00 pm: Optional Networking

Wrap up conversations, visit with colleagues.

Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #623 (April 2, 2024)

Planet Python - Tue, 2024-04-02 15:30

#623 – APRIL 2, 2024
View in Browser »

Reading and Writing WAV Files in Python

In this tutorial, you’ll learn how to work with WAV audio files in Python using the standard-library wave module. Along the way, you’ll synthesize sounds from scratch, visualize waveforms in the time domain, animate real-time spectrograms, and apply special effects to widen the stereo field.
REAL PYTHON

Designing a Pure Python Web Framework

This blog post introduces Reflex, a Python web framework. The post talks about what makes Reflex different from other frameworks and shows you sample starting code. See also the associated HN Discussion.
NIKHIL RAO

Creating an Autopilot in X-Plane Using Python

X-Plane is a flight simulator, and Austin is using Python to create an autopilot using proportional integral derivative controllers. Read on to see how it’s done.
AUSTIN

Mojo Goes Open Source

MODULAR

PyPI Hiring a Support Specialist (Remote)

PYPI

Discussions

Draft PEP: Sealed Decorator for Static Typing

PYTHON DISCUSS

What Are Some Good Python Codebases to Read?

LOBSTERS

Articles & Tutorials

Using Python in Bioinformatics and the Laboratory

How is Python being used to automate processes in the laboratory? How can it speed up scientific work with DNA sequencing? This week on the show, Chemical Engineering PhD Student Parsa Ghadermazi is here to discuss Python in bioinformatics.
REAL PYTHON podcast

Handling Database Migrations With Alembic

Alembic is a change control tool for database content in SQLAlchemy. This article looks at the high-level architecture of how Alembic works, how to add it to your project, and some common workflows you’ll encounter.
PAUL ESCH-LAURENT • Shared by Michael Herman

Python Tricks: A Buffet of Awesome Python Features

Discover Python’s best practices with simple examples and start writing even more beautiful + Pythonic code. “Python Tricks: The Book” shows you exactly how. You’ll master intermediate and advanced-level features in Python with practical examples and a clear narrative. Get the book + video bundle 33% off →
DAN BADER sponsor

Python in List of Best Languages to Learn

The US Bureau of Labor Statistics has identified the top four languages for programmers to learn and Python made the list. Median annual wage of programmers in the US is expected to rise 25% in the next 5 years.
FORTUNE

Finding Python Easter Eggs

Python has its fair share of hidden surprises, commonly known as Easter eggs. From clever jokes to secret messages, these little mysteries are often meant to be discovered by curious geeks like you!
REAL PYTHON course

PyPI Temporarily Halted New Users and Projects

To fend off a supply-chain attack, PyPI temporarily halted new users and projects for about 10 hours last week. This article discusses why, and the scourge of supply-chain attacks.
ARS TECHNICA

Broadcasting in NumPy

Broadcasting in NumPy is not the most exciting topic, but this article explores the topic using a narrative perspective. This is not your standard “broadcasting in NumPy” article!
STEPHEN GRUPPETTA • Shared by Stephen Gruppetta

A Better Python Cache for Slow Function Calls

The folks at Sweep AI needed something more persistent than Python’s lru_cache. This post talks about the design behind a file-based cache decorator they’ve recently released.
WILLIAM ZENG

Jupyter & IPython Terminology Explained

Are you trying to understand the differences between Jupyter Notebook, JupyterLab, IPython, Colab, and related terms? This article is for you.
DATA SCHOOL

How I Manage Python in 2024

This post covers the tools one developer uses in their day-to-day process. Read on for info about mise, uv, ruff, and more.
OUTLORE

Fixing a Bug in PyPy’s Incremental GC

A deep dive on hunting a tricky bug in the garbage collection code inside the alternate interpreter PyPy.
CARL FRIEDRICH BOLZ-TEREICK

Projects & Code

django-prose-editor: Rich Text Editing for Django

GITHUB.COM/MATTHIASK

pycountry: ISO Country, Language, Currency and More

GITHUB.COM/PYCOUNTRY

sqlelf: Explore ELF Objects Through the Power of SQL

GITHUB.COM/FZAKARIA

Python Post-Mortem Debugger

GITHUB.COM/COCOLATO • Shared by cocolato

botasaurus: Framework to Build Awesome Scrapers

GITHUB.COM/OMKARCLOUD

Events

Weekly Real Python Office Hours Q&A (Virtual)

April 3, 2024
REALPYTHON.COM

Canberra Python Meetup

April 4, 2024
MEETUP.COM

Sydney Python User Group (SyPy)

April 4, 2024
SYPY.ORG

PyCascades 2024

April 5 to April 9, 2024
PYCASCADES.COM

PyDelhi User Group Meetup

April 6, 2024
MEETUP.COM

Django Girls Ecuador 2024

April 6, 2024
OPENLAB.EC

Happy Pythoning!
This was PyCoder’s Weekly Issue #623.
View in Browser »

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

Sven Hoexter: PKIX: pathLen Constrain on Root Certificates

Planet Debian - Tue, 2024-04-02 15:07

I recently came across an x509 P(rivate)KI Root Certificate which had a pathLen constraint set on the (self signed) Root Certificate. Since that is not commonly seen, I looked around a bit to get a better understanding of how the pathLen basic constraint should be used.

Primary source is RFC 5280 section 4.2.1.9

The pathLenConstraint field is meaningful only if the cA boolean is asserted and the key usage extension, if present, asserts the keyCertSign bit (Section 4.2.1.3). In this case, it gives the maximum number of non-self-issued intermediate certificates that may follow this certificate in a valid certification path

Since the Root is always self-issued it doesn't count towards the limit, and since it's the last certificate (or the first, depending on how you count) in a chain, it's pretty much pointless to configure a pathLen constraint directly on a Root Certificate.

Another relevant resource are the Baseline Requirements of the CA/Browser Forum (currently v2.0.2). Section 7.1.2.1.4 "Root CA Basic Constraints" describes it as NOT RECOMMENDED for a Root CA.

Last but not least there is the awesome x509 Limbo project which has a section for validating pathLen constraints. Since the RFC 5280 based assumption is that self signed certs do not count, they do not check a case with such a constraint on the Root itself, and what the implementations do about it. So the assumption right now is that they properly ignore it.

Summary: It's pointless to set the pathLen constraint on the Root Certificate, so just don't do it.
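
For illustration only (this is my own sketch with Python's cryptography package, not something from the post), the place where a pathLen constraint normally does useful work is on an intermediate CA certificate, where path_length=0 means the intermediate may sign end-entity certificates but no further CAs:

import datetime

from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.x509.oid import NameOID

# Throwaway keys for the sketch; real CA keys live in an HSM or offline.
root_key = ec.generate_private_key(ec.SECP256R1())
intermediate_key = ec.generate_private_key(ec.SECP256R1())

root_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Root CA")])
intermediate_name = x509.Name(
    [x509.NameAttribute(NameOID.COMMON_NAME, "Example Intermediate CA")]
)
now = datetime.datetime.now(datetime.timezone.utc)

# The intermediate carries the constraint: it may sign leaf certificates,
# but no further CA certificate may follow it in a valid chain.
intermediate_cert = (
    x509.CertificateBuilder()
    .subject_name(intermediate_name)
    .issuer_name(root_name)
    .public_key(intermediate_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=5 * 365))
    .add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)
    .sign(root_key, hashes.SHA256())
)

# On the self-signed root itself you'd use BasicConstraints(ca=True,
# path_length=None): per RFC 5280 a pathLen there has no effect anyway.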

Categories: FLOSS Project Planets

PyCon: PyCon US 2024: Call for Volunteers and Hatchery Registration now Open!

Planet Python - Tue, 2024-04-02 14:45

Looking to make a meaningful contribution to the Python community? Look no further than PyCon US 2024! Whether you're a seasoned Python pro or a newcomer to the community and looking to get involved, there's a volunteer opportunity that's perfect for you.

Sign-up for volunteer roles is done directly through the PyCon US website. This way, you can view and manage shifts you sign up for through your personal dashboard! You can read up on the different roles to volunteer for and how to sign up on the PyCon US website.

PyCon US is largely organized and run by volunteers. Every year, we need to fill more than 300 onsite volunteer hours to ensure everything runs smoothly at the event. And the best part? You don't need to commit a lot of time to make a difference – some shifts are as short as one hour long! You can sign up for as many or as few shifts as you’d like. Even a couple of hours of your time can go a long way in helping us create an amazing experience for attendees.

Keep in mind that you need to be registered for the event to sign up for a volunteer role.

One important way to get involved is to sign up as a Session Chair or Session Runner. This is an excellent opportunity to meet and interact with speakers while helping to ensure that sessions run smoothly. And who knows, you might just learn something new along the way! You can sign up for these roles directly on the Talks schedule.

Volunteer your time at PyCon US 2024 and you’ll be part of a fantastic community that's passionate about Python programming and help us make this year's conference a huge success. Sign up today for the shifts that call to you and join the fun!

Hatchery Program

First introduced in 2018, the Hatchery program offers the pathways for PyCon US attendees to introduce new tracks, activities, summits, demos, etc., at the conference—activities that all share and fulfill the Python Software Foundation’s mission within the PyCon US schedule.

Since its introduction, this program has “hatched” several new tracks that are now staples of our conference, including PyCon US Charlas, Mentored Sprints, and the Maintainer’s Summit. This year, we’ve received eight very compelling proposals. After careful consideration, we have selected four new programs, each of them unique and focused on different aspects of the Python community.

FlaskCon - Friday, May 17, 2024

Join us in a mini conference dedicated to Flask, its community and ecosystem, as well as related web technologies. Meet maintainers and community members, learn about how to get involved, and join us during the sprint days to contribute. Submit your talk proposal today!

Organized by David Lord, Phil Jones, Adam Englander, David Carmichael, Abdur-Rahmaan Janhangeer

Community Organizers Summit - Saturday, May 18, 2024

Do you organize a Conference, Meetup, User Group, Hackathon, or other community event in your area? Are you trying to start a group but don't know where to start? Whether you have 30 years of experience or are looking to create a new event, this summit is for you.

Join us for a summit of Presentations, Panels, and Breakout Sessions about organizing community events.

Organized by Mason Egger, Kevin Horn, and Heather White

Sign-up is required. Register to secure your spot.

Humble Data - Saturday, May 18, 2024

Are you eager to embark on a tech career but unsure where to start? Are you curious about data science? Taking the first steps in this area is hard, but you don’t have to do it alone. Join our workshop for complete beginners and get started in Python data science - even if you’ve never written a single line of code!

We invite those from underrepresented groups to apply to join us for a fun, supportive workshop that will give you the confidence to get started in this exciting area. You can expect plenty of exercises, as well as inspiring talks from those who were once in your shoes. You’ll cover the basics of programming in Python, as well as useful libraries and tools such as Jupyter notebooks, pandas, and Matplotlib.

In this hands-on workshop, you’ll work through a series of beginner-friendly materials at your own pace. You’ll work within small groups, each with an assigned mentor, who will be there to help you with any questions or whenever you get stuck. All you’ll need to bring is a laptop that can connect to the internet and a willingness to learn!

Organized by Cheuk Ting Ho and Jodie Burchell

Sign-up is required. Register to secure your spot.

Documentation Summit - Sunday, May 19, 2024

A full-day summit including talks and panel sessions inviting leaders in documentation to share their experience in how to make good documentation, discussion about documentation tools such as Sphinx, MkDocs, and themes, what the common mistakes are, and how to avoid them. Accessibility is also an important topic, so we will cover talks and discussions regarding the accessibility of documentation.

This summit is aimed at anyone who cares about or is involved in any aspect of open source documentation, such as, but not limited to, technical writers, developers, developer advocates, project maintainers and contributors, accessibility experts, documentation tooling developers, and documentation end-users.

Organized by Cheuk Ting Ho

Sign-up is required. Register to secure your spot.

Register Now

Registration for PyCon US is now open, and all of the Hatchery programs are included as part of your PyCon US registration (no additional cost). Some of the programs require advance sign-up, in which case walk-ins will only be accepted if space is available. Please check each Hatchery program carefully to determine whether registration is required or not.

Head over to your PyCon US Dashboard to add any of the above Hatchery programs to your PyCon US registration. Don't worry, you can always change your mind and cancel later to open up the space for someone else!

Congratulations to all the accepted program organizers! Thank you for bringing forward your fresh ideas to PyCon US. We look forward to seeing you in Pittsburgh.
Categories: FLOSS Project Planets
