Feeds
Freexian Collaborators: Debian Contributions: OpenMPI transitions, cPython 3.12.7+ update uploads, Python 3.13 Transition, and more! (by Anupa Ann Joseph, Stefano Rivera)
Contributing to Debian is part of Freexian’s mission. This article covers the latest achievements of Freexian and their collaborators. All of this is made possible by organizations subscribing to our Long Term Support contracts and consulting services.
Transition management, by Emilio Pozuelo Monfort

Emilio has been helping finish the mpi-defaults switch to mpich on 32-bit architectures, and the openmpi transitions. This involves filing bugs for the reverse dependencies, doing NMUs, and requesting removals for outdated (Not Built from Source) binaries on 32-bit architectures where openmpi is no longer available. Those transitions got entangled with a few others, such as the petsc stack, and were blocking many packages from migrating to testing. These transitions were completed in early December.
cPython 3.12.7+ update uploads, by Stefano Rivera

Python 3.12 had failed to build on mips64el, due to an obscure dh_strip failure. The mips64el porters never figured it out, but the missing build on mips64el was blocking migration to Debian testing. After waiting a month, enough changes had accumulated in the upstream 3.12 maintenance git branch that we could apply them in the hope of changing the output enough to avoid breaking dh_strip. This worked.
Of course, there were other things to deal with too. A test started failing due to a Debian-specific patch we carry for python3.x-minimal, and it needed to be reworked. Stefano also forgot to strip the trailing + from PY_VERSION, which confuses some Python libraries; stripping it always requires another patch when applying git updates from the maintenance branch. Stefano added a build-time check to catch this mistake in the future. Python 3.12.7 migrated.
Python 3.13 Transition, by Stefano Rivera and Colin Watson

During November, the Python 3.13-add transition started. This is the first stage of supporting a new version of Python in the Debian archive (after preparatory work), adding it as a new supported but non-default version. All packages with compiled Python extensions need to be rebuilt to add support for the new version.
We have covered the lead-up to this transition in the past. Thanks to that preparation, many of the failures we hit were expected, and we had patches waiting in the bug tracker that could be NMUed to get the transition moving. Others had been known about but hadn't been worked on yet.
Some other packages ran into new issues, as we got further into the transition than we’d been able to in preparation. The whole Debian Python team has been helping with this work.
The rebuild stage of the 3.13-add transition is now over, but many packages need work before britney will let python3-defaults migrate to testing.
Limiting build concurrency based on available RAM, by Helmut Grohne

In recent years, the concurrency of CPUs has been increasing, as has the demand for RAM by linkers. What has not been increasing as quickly is the RAM supply in typical machines. As a result, we more frequently run into situations where package builds exhaust memory when building at full concurrency. Helmut initiated a discussion about generalizing an approach to this in Debian packages, researching existing code that limits concurrency as well as proposing possible extensions to debhelper and dpkg to provide concurrency limits based on available system RAM. Thus far there is consensus on the need for a more general solution, but ideas are still being collected for the precise design.
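To make the idea concrete, here is a minimal, hypothetical sketch of RAM-aware concurrency limiting; the helper name and the per-job memory estimate are illustrative assumptions, not anything proposed for debhelper or dpkg:

import os

def ram_limited_jobs(ram_per_job_mib=2048):
    """Pick a build concurrency capped by available RAM (Linux-only sketch)."""
    with open("/proc/meminfo") as f:
        meminfo = dict(line.split(":", 1) for line in f if ":" in line)
    # MemAvailable is reported in kB; convert to MiB.
    available_mib = int(meminfo["MemAvailable"].split()[0]) // 1024
    # Allow one job per ram_per_job_mib of RAM, but never more jobs than
    # CPUs and never fewer than one.
    return max(1, min(os.cpu_count() or 1, available_mib // ram_per_job_mib))

print(ram_limited_jobs())  # e.g. feed the result into make -jN

The point of the discussion is where such a computation should live and what interface it should expose, rather than the arithmetic itself.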
MiniDebConf Toulouse at Capitole du Libre

The whole Freexian Collaborator team attended MiniDebConf Toulouse, part of the Capitole du Libre event. Several members of the team gave talks:
- Santiago spoke on Linux Live Patching in Debian, presenting an update on the idea since DebConf 24. This includes the initial requirements for the livepatch package format that would be used to distribute the livepatches.
- Stefano, Colin, Enrico, and Carles spoke on Using Debusine to Automate QA.
- Santiago and Roberto spoke on How LTS Goes Beyond LTS.
- Helmut spoke on Cross Building.
- Carles gave a lightning talk on po-debconf-manager.
Stefano and Anupa worked as part of the video team, streaming and recording the event’s talks.
Miscellaneous contributions

- Stefano looked into packaging the latest upstream python-falcon version in Debian, in support of the Python 3.13 transition. This appeared to break python-hug, which is sadly looking neglected upstream; the best course of action is probably its removal from Debian.
- Stefano uploaded videos from various 2024 Debian events to PeerTube and YouTube.
- Stefano and Santiago visited the site for DebConf 2025 in Brest, after the MiniDebConf in Toulouse, to meet with the local team and scout out the venue. The ongoing DebConf 25 organization work last month also included handling the call for logo and artwork proposals.
- Stefano helped the press team edit a post for bits.debian.org on OpenStreetMap's migration to Debian.
- Carles implemented multiple-language support in po-debconf-manager and tested it with Brazilian Portuguese during MiniDebConf Toulouse. The system was also tested and improved by reviewing more than 20 translations into Catalan, creating merge requests for those packages, and providing support to new users. Additionally, Carles implemented better status transitions, configuration key management, and other small improvements.
- Helmut sent 32 patches for cross build failures. The wireplumber one was an interactive collaboration with Dylan Aïssi.
- Helmut continued to monitor the /usr-move, sent a patch for lib64readline8, and continued several older patch conversations. lintian now reports some aliasing issues in unstable.
- Helmut initiated a discussion on the semantics of *-for-host packages. More feedback is welcome.
- Helmut improved the crossqa.debian.net infrastructure so that lintian runs fail less often on larger packages.
- Helmut continued maintaining rebootstrap, mostly dropping applied patches and continuing discussions of submitted patches.
- Helmut prepared a non-maintainer upload of gzip for several long-standing bugs.
- Colin came up with a plan for resolving the multipart vs. python-multipart name conflict, and began work on converting reverse dependencies.
- Colin upgraded 42 Python packages to new upstream versions. Some were complex: python-catalogue had some upstream version confusion, pydantic and rpds-py involved several Rust package upgrades as prerequisites, and python-urllib3 involved first packaging python-quart-trio and then vendoring an unpackaged test dependency.
- Colin contributed Incus support to needrestart upstream.
- Lucas set up a machine to rebuild all Ruby reverse dependencies to check what will be broken by adding Ruby 3.3 as an alternative interpreter. The tool used for this is mass-rebuild, and the initial rebuilds have already started. The Ruby interpreter maintainers are planning to experiment with debusine next time.
- Lucas is organizing a Debian Ruby sprint towards the end of January in Paris. The team plans to finish any remaining bits of the Ruby 3.3 transition, push the Rails 7 transition, and fix RC bugs affecting the Ruby ecosystem in Debian.
- Anupa attended a Debian Publicity team meeting in person during MiniDebCamp Toulouse.
- Anupa moderated and posted in the Debian Administrator group on LinkedIn.
Seth Michael Larson: New experimental Debian package for Cosign (Sigstore)
Published 2024-12-09 by Seth Larson
Cosign has a new experimental package available for Debian thanks to the work of Simon Josefsson. Simon and I had an email exchange about Sigstore and Cosign on Debian after the discussion about PEP 761 (Deprecation and discontinuation of PGP signatures).
Debian and other downstream distros of Python and Python packages are incredibly important consumers of verification materials. Because these distros actually verify materials for every build of a package, this increases the confidence for other users using these same artifacts even without those users directly verifying the materials themselves. We need more actors in the ecosystem doing end-to-end verification to dissuade attackers from supply-chain attacks targeting artifact repositories like python.org and PyPI.
Trying Cosign in Docker

I gave the experimental package a try using the Debian Docker image to verify CPython 3.14.0-alpha2's tarball and verification materials:
$ docker run --rm -it debian:bookworm

# Install the basics for later use.
$ apt-get install ca-certificates wget

# Add Simon's experimental package repo
# and install Cosign! :party:
$ echo "deb [trusted=yes] https://salsa.debian.org/jas/cosign/-/jobs/6682245/artifacts/raw/aptly experimental main" | \
    tee --append /etc/apt/sources.list.d/add.list
$ apt-get update
$ apt-get install cosign

$ cosign version
[... COSIGN ASCII-art banner ...]
cosign: A tool for Container Signing, Verification and Storage in an OCI registry.

Now we can test Cosign out with CPython's artifacts. We expect Hugo van Kemenade (hugo@python.org) as the release manager for Python 3.14:
# Download the source and Sigstore bundle
$ wget https://www.python.org/ftp/python/3.14.0/Python-3.14.0a2.tgz
$ wget https://www.python.org/ftp/python/3.14.0/Python-3.14.0a2.tgz.sigstore

# Verify with Cosign!
$ cosign verify-blob \
    --certificate-identity hugo@python.org \
    --certificate-oidc-issuer https://github.com/login/oauth \
    --bundle ./Python-3.14.0a2.tgz.sigstore \
    --new-bundle-format \
    ./Python-3.14.0a2.tgz
Verified OK

Overall, this is working as expected from my point of view! Simon had a few open questions, mostly for Cosign's upstream project. I am hopeful that this means we'll begin seeing Sigstore bundles and their derivatives (such as attestations from the Python Package Index) used for downstream verification by distros like Debian. Exciting times ahead!
New Bundle Format

My first attempt didn't include the --new-bundle-format option, and that resulted in an opaque error. Hopefully this user-experience issue will be phased out and Cosign will "default" to the new bundle format? I included this error strictly for folks searching for this error message and wanting to fix their issue.
Error: bundle does not contain cert for verification, please provide public key
main.go:74: error during command execution: bundle does not contain cert for verification, please provide public key

This critical role would not be possible without funding from the Alpha-Omega project.

Have thoughts or questions? Let's chat over email or social:
sethmichaellarson@gmail.com
@sethmlarson@fosstodon.org
Python⇒Speed: Reducing CO₂ emissions with faster software
What can you as a software developer do to fight climate change? My first and primary answer is getting involved with local politics. However, if you write software that operates at sufficient scale, you can also reduce carbon emissions by making your software faster.
In this article we’ll cover:
- Why more computation uses more electricity.
- Why you probably don’t need to think about this most of the time.
- Reducing emissions by reducing compute time.
- Reducing emissions with parallelism (even with the same amount of compute time!).
- Some alternative scenarios and caveats: embodied emissions and Jevons Paradox.
GNUnet News: GNUnet 0.23.0
We are pleased to announce the release of GNUnet 0.23.0.
GNUnet is an alternative network stack for building secure, decentralized and privacy-preserving distributed applications. Our goal is to replace the old insecure Internet protocol stack. Starting from an application for secure publication of files, it has grown to include all kinds of basic protocol components and applications towards the creation of a GNU internet.
This is a new major release. It breaks protocol compatibility with the 0.22.0X versions. Please be aware that Git master is thus henceforth (and has been for a while) INCOMPATIBLE with the 0.22.0X GNUnet network, and interactions between old and new peers will result in issues. In terms of usability, users should be aware that there are still a number of known open issues, in particular with respect to ease of use, but also some critical privacy issues, especially for mobile users. Also, the nascent network is tiny and thus unlikely to provide good anonymity or extensive amounts of interesting information. As a result, the 0.23.0 release is still only suitable for early adopters with some reasonable pain tolerance.
Download links

- gnunet-0.23.0.tar.gz (signature)
- gnunet-0.23.0-meson.tar.xz (signature) NEW: Test tarball made using the meson build system.
- gnunet-gtk-0.23.0.tar.gz (signature)
- gnunet-fuse-0.23.0.tar.gz (signature)
The GPG key used to sign is: 3D11063C10F98D14BD24D1470B0998EF86F59B6A
Note that due to mirror synchronization, not all links might be functional early after the release. For direct access try http://ftp.gnu.org/gnu/gnunet/
Changes

A detailed list of changes can be found in the git log, the NEWS and the bug tracker. Noteworthy highlights are:
- Code review: A number of issues found during a code review have been addressed.
- util: A GNUNET_OS_ProjectData must now be passed to some APIs that are commonly used by third parties using libgnunetutil (e.g. Taler, GNUnet-Gtk), so as to properly handle cases where the GNUnet installation directory is different from the third-party directory.
- Build System: Improved build times by outsourcing the handbook to prebuilt files and only generating GANA source files manually.
- There are known major design issues in the CORE subsystems which will need to be addressed in the future to achieve acceptable usability, performance and security.
- There are known moderate implementation limitations in CADET that negatively impact performance.
- There are known moderate design issues in FS that also impact usability and performance.
- There are minor implementation limitations in SET that create unnecessary attack surface for availability.
- The RPS subsystem remains experimental.
In addition to this list, you may also want to consult our bug tracker at bugs.gnunet.org which lists about 190 more specific issues.
Thanks

This release was the work of many people. The following people contributed code and were thus easily identified: Christian Grothoff, TheJackiMonster, oec, ch3, and Martin Schanzenbach.
Dirk Eddelbuettel: pinp 0.0.11 on CRAN: Maintenance
A new version of our pinp package arrived on CRAN today, and is the first release in four years. The pinp package allows for snazzier one or two column Markdown-based pdf vignettes, and is now used by a few packages. A screenshot of the package vignette can be seen below. Additional screenshots are at the pinp page.
This release contains no new features or new user-facing changes but reflects the standard package and repository maintenance over the four-year window since the last release: updating of actions, updating of URLs and addressing small packaging changes spotted by ever-more-vigilant R checking code.
The NEWS entry for this release follows.
Changes in pinp version 0.0.11 (2024-12-08)

- Standard package maintenance for continuous integration, URL updates, and packaging conventions
- Correct two minor nags in the Rd file
Courtesy of my CRANberries, there is a diffstat report relative to the previous release. More detailed information is on the pinp page. For questions or comments use the issue tracker off the GitHub repo. If you like this or other open-source work I do, you can sponsor me at GitHub.
This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.
Design System December Updates
Hey team!
Back with a series of updates on the Plasma Design System work that we are doing. All videos contain English captions.
Leave your feedback or let us know if you have any questions.
Dries Buytaert: Drupal CMS: the official name for Drupal Starshot
We're excited to announce that "Drupal CMS" will be the official name for the product developed by the Drupal Starshot Initiative.
The name "Drupal CMS" was chosen after user testing with both newcomers and experienced users. This name consistently scored highest across all tested groups, including marketers unfamiliar with Drupal.
Participants appreciated the clarity it brings:
"Having the words CMS in the name will make it clear what the product is. People would know that Drupal was a content management system by the nature of its name, rather than having to ask what Drupal is."

"I'm a designer familiar with the industry, so the term CMS or content management system is the term or phrase that describes this product most accurately in my opinion. I think it is important to have CMS in the title."

The name "Drupal Starshot" will remain an internal code name until the first release of Drupal CMS, after which it will most likely be retired.
Freelock Blog: Show a mix of future and past events
Another automation we did for Programming Librarian, a site for librarians to plan educational programs, involved events. They wanted to always feature 3 events on the home page, and the most important events were in the future.
Real Python: How to Run Your Python Scripts and Code
Running a Python script is a fundamental task for any Python developer. You can execute a Python .py file through various methods depending on your environment and platform. On Windows, Linux, and macOS, use the command line by typing python script_name.py to run your script. You can also use the python command with the -m option to execute modules. This tutorial covers these methods and more, ensuring you can run Python scripts efficiently.
By the end of this tutorial, you’ll understand that:
- Running a Python .py script involves using the python command followed by the script’s filename in the terminal or command prompt.
- Running Python from the command prompt requires you to open the command prompt, navigate to the script’s directory, and execute it using python script_name.py.
- Running a .py file in Windows can be done directly from the command prompt or by double-clicking the file if Python is associated with .py files.
- Running a Python script without Python installed is possible by using online interpreters or converting scripts to executables, but it’s more flexible to install Python and run scripts natively.
To get the most out of this tutorial, you should know the basics of working with your operating system’s terminal and file manager. It’d also be beneficial for you to be familiar with a Python-friendly IDE or code editor and with the standard Python REPL (Read-Eval-Print Loop).
Free Download: Get a sample chapter from Python Tricks: The Book that shows you Python’s best practices with simple examples you can apply instantly to write more beautiful + Pythonic code.
What Scripts and Modules Are

In computing, the term script refers to a text file containing a logical sequence of orders that you can run to accomplish a specific task. These orders are typically expressed in a scripting language, which is a programming language that allows you to manipulate, customize, and automate tasks.
Scripting languages are usually interpreted at runtime rather than compiled. So, scripts are typically run by some kind of interpreter, which is responsible for executing each order in a sequence.
Python is an interpreted language. Because of that, Python programs are commonly called scripts. However, this terminology isn’t completely accurate because Python programs can be way more complex than a simple, sequential script.
In general, a file containing executable Python code is called a script—or an entry-point script in more complex applications—which is a common term for a top-level program. On the other hand, a file containing Python code that’s designed to be imported and used from another Python file is called a module.
So, the main difference between a module and a script is that modules store importable code while scripts hold executable code.
Note: Importable code is code that defines something but doesn’t perform a specific action. Some examples include function and class definitions. In contrast, executable code is code that performs specific actions. Some examples include function calls, loops, and conditionals.
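For a small, hypothetical illustration of the difference (the file and function names are made up for this example), compare a module that only contains definitions with a script that performs an action:

# greeting.py — a module: it only contains importable definitions.
def greet(name):
    return f"Hello, {name}!"

# main.py — a script: it imports the module and performs an action.
from greeting import greet

print(greet("World"))

Running python main.py prints the greeting, while importing greeting on its own produces no output.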
In the following sections, you’ll learn how to run Python scripts, programs, and code in general. To kick things off, you’ll start by learning how to run them from your operating system’s command line or terminal.
How to Run Python Scripts From the Command Line

In Python programming, you'll write programs in plain text files. By convention, files containing Python code use the .py extension, and there's no distinction between scripts or executable programs and modules. All of them will use the same extension.
Note: On Windows systems, the extension can also be .pyw for those applications that should use the pythonw.exe launcher.
To create a Python script, you can use any Python-friendly code editor or IDE (integrated development environment). To keep moving forward in this tutorial, you’ll need to create a basic script, so fire up your favorite text editor and create a new hello.py file containing the following code:
Python hello.py

print("Hello, World!")

This is the classic "Hello, World!" program in Python. The executable code consists of a call to the built-in print() function that displays the "Hello, World!" message on your screen.
With this small program ready, you’re ready to learn different ways to run it. You’ll start by running the program from your command line, which is arguably the most commonly used approach to running scripts.
Using the python Command

To run Python scripts with the python command, you need to open a command-line window and type in the word python followed by the path to your target script:
Windows PowerShell

PS> python .\hello.py
Hello, World!

PS> py .\hello.py
Hello, World!

Shell

$ python ./hello.py
Hello, World!

Read the full article at https://realpython.com/run-python-scripts/ »
Real Python: Using and Creating Global Variables in Your Python Functions
In Python, global variables are accessible across your entire program, including within functions. Understanding how Python handles global variables is key to writing efficient code. This tutorial will guide you through accessing and modifying global variables in Python functions using the global keyword and the globals() function. You’ll also learn to manage scope and avoid potential conflicts between local and global variables.
You’ll explore how to create global variables inside functions and apply strategies to minimize their use, ensuring your code remains clean and maintainable. After reading this tutorial, you’ll be adept at managing global variables and understanding their impact on your Python code.
By the end of this tutorial, you’ll understand that:
- A global variable in Python is a variable defined at the module level, accessible throughout the program.
- Accessing and modifying global variables inside Python functions can be achieved using the global keyword or the globals() function.
- Python handles name conflicts by searching scopes from local to built-in, potentially causing name shadowing challenges.
- Creating global variables inside a function is possible using the global keyword or globals(), but it’s generally not recommended.
- Strategies to avoid global variables include using constants, passing arguments, and employing classes and methods to encapsulate state.
To follow along with this tutorial, you should have a solid understanding of Python programming, including fundamental concepts such as variables, data types, scope, mutability, functions, and classes.
Get Your Code: Click here to download the free sample code that you’ll use to understand when and how to work with global variables in your Python functions.
Using Global Variables in Python Functions

Global variables are those that you can access and modify from anywhere in your code. In Python, you'll typically define global variables at the module level. So, the containing module is their scope.
Note: You can also define global variables inside functions, as you’ll learn in the section Creating Global Variables Inside a Function.
Once you’ve defined a global variable, you can use it from within the module itself or from within other modules in your code. You can also use global variables in your functions. However, those cases can get a bit confusing because of differences between accessing and modifying global variables in functions.
To understand these differences, consider that Python can look for variables in four different scopes:
- The local, or function-level, scope, which exists inside functions
- The enclosing, or non-local, scope, which appears in nested functions
- The global scope, which exists at the module level
- The built-in scope, which is a special scope for Python’s built-in names
To illustrate, say that you’re inside an inner function. In that case, Python can look for names in all four scopes.
When you access a variable in that inner function, Python first looks inside that function. If the variable doesn’t exist there, then Python continues with the enclosing scope of the outer function. If the variable isn’t defined there either, then Python moves to the global and built-in scopes in that order. If Python finds the variable, then you get the value back. Otherwise, you get a NameError:
Python

>>> # Global scope
>>> def outer_func():
...     # Non-local scope
...     def inner_func():
...         # Local scope
...         print(some_variable)
...     inner_func()
...

>>> outer_func()
Traceback (most recent call last):
    ...
NameError: name 'some_variable' is not defined

>>> some_variable = "Hello from global scope!"

>>> outer_func()
Hello from global scope!

When you launch an interactive session, it starts off at the module level of global scope. In this example, you have outer_func(), which defines inner_func() as a nested function. From the perspective of this nested function, its own code block represents the local scope, while the outer_func() code block before the call to inner_func() represents the non-local scope.
If you call outer_func() without defining some_variable in either of your current scopes, then you get a NameError exception because the name isn’t defined.
If you define some_variable in the global scope and then call outer_func(), then you get Hello from global scope! on your screen. Internally, Python has searched the local, non-local, and global scopes to find some_variable and print its content. Note that you can define this variable in any of the three scopes, and Python will find it.
This search mechanism makes it possible to use global variables from inside functions. However, while taking advantage of this feature, you can face a few issues. For example, accessing a variable works, but directly modifying a variable doesn’t work:
Python

>>> number = 42

>>> def access_number():
...     return number
...

>>> access_number()
42

>>> def modify_number():
...     number = 7
...

>>> modify_number()
>>> number
42

The access_number() function works fine. It looks for number and finds it in the global scope. In contrast, modify_number() doesn't work as expected. Why doesn't this function update the value of your global variable, number? The problem is the scope of the variable. You can't directly modify a variable from a high-level scope like global in a lower-level scope like local.
Internally, Python assumes that any name directly assigned within a function is local to that function. Therefore, the local name, number, shadows its global sibling.
In this sense, global variables behave as read-only names. You can access their values, but you can’t modify them.
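The fix is covered later in the full tutorial, but as a quick preview, declaring the name with the global keyword makes the assignment target the module-level variable instead of creating a new local one:

>>> number = 42
>>> def modify_number():
...     global number
...     number = 7
...

>>> modify_number()
>>> number
7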
Note: The discussion about modifying global variables inside functions revolves around assignment operations rather than in-place mutations of mutable objects. You’ll learn about the effects of mutability on global variables in the section Understanding How Mutability Affects Global Variables.
Read the full article at https://realpython.com/python-use-global-variable-in-function/ »
Real Python: Asynchronous Tasks With Django and Celery
Integrating Celery with your Django application allows you to offload time-consuming tasks, ensuring smooth user experiences. Celery is a distributed task queue that processes tasks asynchronously, preventing delays in your web app’s response time. By using Celery with Django, you can efficiently manage tasks like sending emails, processing images, and analyzing data without slowing down your application.
Celery works by leveraging a message broker like Redis to communicate between your Django app and Celery workers. This setup enables you to handle tasks outside the main execution thread, improving your app’s performance.
By the end of this tutorial, you’ll understand that:
- Celery is a distributed task queue that handles tasks outside the main Django app flow.
- Python’s Celery excels at offloading work and scheduling tasks independently.
- Using Celery in Django helps maintain app responsiveness during time-intensive tasks.
- Configuring Celery in Django involves setting up a message broker and defining tasks.
- A Celery worker performs tasks asynchronously, freeing up the main app.
- Running a task in Celery requires calling the task with .delay() or .apply_async().
- Celery is not a message queue but uses a message broker like Redis for communication.
You’re in the right place if you’ve never used Celery in a Django app before, or if you’ve peeked into Celery’s documentation but couldn’t find your way around. You’ll learn how to configure Celery in Django to handle tasks asynchronously, ensuring your application remains responsive and efficient.
To focus this tutorial on the essentials, you’ll integrate Celery into an existing Django app. Go ahead and download the code for that app so that you can follow along:
Get Your Code: Click here to download the free sample code you'll use to integrate Celery into your Django app.
Python Celery Basics

Celery is a distributed task queue that can collect, record, schedule, and perform tasks outside of your main program.
Note: Celery dropped support for Windows in version 4, so while you may still be able to get it to work on Windows, you’re better off using a different task queue, such as huey or Dramatiq, instead.
In this tutorial, you’ll focus on using Celery on UNIX systems, so if you’re trying to set up a distributed task queue on Windows, then this might not be the right tutorial for you.
To receive tasks from your program and send results to a back end, Celery requires a message broker for communication. Redis and RabbitMQ are two message brokers that developers often use together with Celery.
In this tutorial, you’ll use Redis as the message broker. To challenge yourself, you can stray from the instructions and use RabbitMQ as a message broker instead.
If you want to keep track of the results of your task runs, then you also need to set up a results back end database.
Note: Connecting Celery to a results back end is optional. Once you instruct Celery to run a task, it’ll do its duty whether you keep track of the task result or not.
However, keeping a record of all task results is often helpful, especially if you’re distributing tasks to multiple queues. To persist information about task results, you need a database back end.
You can use many different databases to keep track of Celery task results. In this tutorial, you’ll work with Redis both as a message broker and as a results back end. By using Redis, you limit the dependencies that you need to install because it can take on both roles.
You won’t do any work with the recorded task results in the scope of this tutorial. However, as a next step, you could inspect the results with the Redis command-line interface (CLI) or pull information into a dedicated page in your Django project.
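To make the setup concrete, here's a minimal sketch of a Celery app that uses Redis for both roles. The module name, Redis URL, and task are illustrative assumptions, not the tutorial's actual project code:

from celery import Celery

# One Redis instance serves as both the message broker and the results back end.
app = Celery(
    "demo",
    broker="redis://localhost:6379/0",
    backend="redis://localhost:6379/0",
)

@app.task
def add(x, y):
    return x + y

With a worker running (celery -A demo worker), calling add.delay(2, 3) sends the task to the broker and returns an AsyncResult whose .get() fetches the value 5 from the results back end.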
Why Use Celery?

There are two main reasons why most developers want to start using Celery:
- Offloading work from your app to distributed processes that can run independently of your app
- Scheduling task execution at a specific time, sometimes as recurring events
Celery is an excellent choice for both of these use cases. It defines itself as “a task queue with focus on real-time processing, while also supporting task scheduling” (Source).
Even though both of these functionalities are part of Celery, they’re often addressed separately:
- Celery workers are worker processes that run tasks independently from one another and outside the context of your main service.
- Celery beat is a scheduler that orchestrates when to run tasks. You can use it to schedule periodic tasks as well.
Celery workers are the backbone of Celery. Even if you aim to schedule recurring tasks using Celery beat, a Celery worker will pick up your instructions and handle them at the scheduled time. What Celery beat adds to the mix is a time-based scheduler for Celery workers.
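Although this tutorial won't go further into it, Celery beat's schedule is just configuration on the app. Continuing the hypothetical sketch from above (task names and timings are illustrative):

from celery.schedules import crontab

app.conf.beat_schedule = {
    # Run the add task every 30 seconds.
    "add-every-30-seconds": {
        "task": "demo.add",
        "schedule": 30.0,
        "args": (2, 3),
    },
    # Or run it every morning at 7:30 with a crontab expression.
    "add-every-morning": {
        "task": "demo.add",
        "schedule": crontab(hour=7, minute=30),
        "args": (2, 3),
    },
}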
In this tutorial, you’ll learn how to integrate Celery with Django to perform operations asynchronously from the main execution thread of your app using Celery workers.
You won’t tackle task scheduling with Celery beat in this tutorial, but once you understand the basics of Celery tasks, you’ll be well equipped to set up periodic tasks with Celery beat.
Read the full article at https://realpython.com/asynchronous-tasks-with-django-and-celery/ »
Real Python: Effective Python Testing With pytest
pytest is a popular testing framework for Python that simplifies the process of writing and executing tests. To start using pytest, install it with pip in a virtual environment. pytest offers several advantages over unittest that ships with Python, such as less boilerplate code, more readable output, and a rich plugin ecosystem.
pytest comes packed with features to boost your productivity. Its fixtures allow for explicit dependency declarations, making tests more understandable and reducing implicit dependencies. Parametrization in pytest helps prevent redundant test code by enabling multiple test cases from a single test function definition. This framework is highly customizable, so you can tailor it to your project’s needs.
By the end of this tutorial, you’ll understand that:
- Using pytest requires installing it with pip in a virtual environment to set up the pytest command.
- pytest allows for less code, easier readability, and more features compared to unittest.
- Managing test dependencies and state with pytest is made efficient through the use of fixtures, which provide explicit dependency declarations.
- Parametrization in pytest helps avoid redundant test code by allowing multiple test scenarios from a single test function (see the sketch after this list).
- Assertion introspection in pytest provides detailed information about failures in the test report.
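As a small sketch of that parametrization feature (the test and values are made up for illustration):

import pytest

@pytest.mark.parametrize("number, expected", [(1, 2), (2, 4), (3, 6)])
def test_double(number, expected):
    assert 2 * number == expected

pytest expands this single function into three test cases, one per tuple, with no duplicated code.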
Free Bonus: 5 Thoughts On Python Mastery, a free course for Python developers that shows you the roadmap and the mindset you’ll need to take your Python skills to the next level.
How to Install pytest

To follow along with some of the examples in this tutorial, you'll need to install pytest. Like most Python packages, pytest is available on PyPI. You can install it in a virtual environment using pip:
Windows PowerShell

PS> python -m venv venv
PS> .\venv\Scripts\activate
(venv) PS> python -m pip install pytest

Shell

$ python -m venv venv
$ source venv/bin/activate
(venv) $ python -m pip install pytest

The pytest command will now be available in your installation environment.
What Makes pytest So Useful?

If you've written unit tests for your Python code before, then you may have used Python's built-in unittest module. unittest provides a solid base on which to build your test suite, but it has a few shortcomings.
A number of third-party testing frameworks attempt to address some of the issues with unittest, and pytest has proven to be one of the most popular. pytest is a feature-rich, plugin-based ecosystem for testing your Python code.
If you haven’t had the pleasure of using pytest yet, then you’re in for a treat! Its philosophy and features will make your testing experience more productive and enjoyable. With pytest, common tasks require less code and advanced tasks can be achieved through a variety of time-saving commands and plugins. It’ll even run your existing tests out of the box, including those written with unittest.
As with most frameworks, some development patterns that make sense when you first start using pytest can start causing pains as your test suite grows. This tutorial will help you understand some of the tools pytest provides to keep your testing efficient and effective even as it scales.
Less Boilerplate

Most functional tests follow the Arrange-Act-Assert model:
- Arrange, or set up, the conditions for the test
- Act by calling some function or method
- Assert that some end condition is true
Testing frameworks typically hook into your test’s assertions so that they can provide information when an assertion fails. unittest, for example, provides a number of helpful assertion utilities out of the box. However, even a small set of tests requires a fair amount of boilerplate code.
Imagine you’d like to write a test suite just to make sure that unittest is working properly in your project. You might want to write one test that always passes and one that always fails:
Python test_with_unittest.py

from unittest import TestCase

class TryTesting(TestCase):
    def test_always_passes(self):
        self.assertTrue(True)

    def test_always_fails(self):
        self.assertTrue(False)

You can then run those tests from the command line using the discover option of unittest:
Shell

(venv) $ python -m unittest discover
F.
======================================================================
FAIL: test_always_fails (test_with_unittest.TryTesting)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "...\effective-python-testing-with-pytest\test_with_unittest.py", line 10, in test_always_fails
    self.assertTrue(False)
AssertionError: False is not true

----------------------------------------------------------------------
Ran 2 tests in 0.006s

FAILED (failures=1)

As expected, one test passed and one failed. You've proven that unittest is working, but look at what you had to do:
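The full article walks through that boilerplate point by point. For contrast, here is a minimal sketch of the same two tests written for pytest, where plain assert statements are enough:

# test_with_pytest.py

def test_always_passes():
    assert True

def test_always_fails():
    assert False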
Read the full article at https://realpython.com/pytest-python-testing/ »
Real Python: Python Timer Functions: Three Ways to Monitor Your Code
A timer is a powerful tool for monitoring the performance of your Python code. By using the time.perf_counter() function, you can measure execution time with exceptional precision, making it ideal for benchmarking. Using a timer involves recording timestamps before and after a specific code block and calculating the time difference to determine how long your code took to run.
In this tutorial, you’ll explore three different approaches to implementing timers: classes, decorators, and context managers. Each method offers unique advantages, and you’ll learn when and how to use them to achieve optimal results. Plus, you’ll have a fully functional Python timer that can be applied to any program to measure execution time efficiently.
By the end of this tutorial, you’ll understand that:
- time.perf_counter() is the best choice for accurate timing in Python due to its high resolution.
- You can create custom timer classes to encapsulate timing logic and reuse it across multiple parts of your program.
- Using decorators lets you seamlessly add timing functionality to existing functions without altering their code.
- You can leverage context managers to neatly measure execution time in specific code blocks, improving both resource management and code clarity.
Along the way, you’ll gain deeper insights into how classes, decorators, and context managers work in Python. As you explore real-world examples, you’ll discover how these concepts can not only help you measure code performance but also enhance your overall Python programming skills.
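As a small taste of the context-manager approach, here is a sketch using only the standard library (the helper name and label are illustrative):

import time
from contextlib import contextmanager

@contextmanager
def timer(label="block"):
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed = time.perf_counter() - start
        print(f"{label}: {elapsed:0.4f} seconds")

# Any code inside the with block gets timed.
with timer("sum"):
    total = sum(range(1_000_000))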
Decorators Q&A Transcript: Click here to get access to a 25-page chat log from our Python decorators Q&A session in the Real Python Community Slack where we discussed common decorator questions.
Python Timers

First, you'll take a look at some example code that you'll use throughout the tutorial. Later, you'll add a Python timer to this code to monitor its performance. You'll also learn some of the simplest ways to measure the running time of this example.
Python Timer Functions

If you check out the built-in time module in Python, then you'll notice several functions that can measure time, including monotonic(), perf_counter(), process_time(), and time().

Python 3.7 introduced several new functions, like thread_time(), as well as nanosecond versions of all of these functions, named with an _ns suffix. For example, perf_counter_ns() is the nanosecond version of perf_counter(). You'll learn more about these functions later. For now, note what the documentation has to say about perf_counter():
Return the value (in fractional seconds) of a performance counter, i.e. a clock with the highest available resolution to measure a short duration. (Source)
First, you’ll use perf_counter() to create a Python timer. Later, you’ll compare this with other Python timer functions and learn why perf_counter() is usually the best choice.
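In its simplest form, timing with perf_counter() is just two timestamps and a subtraction. A minimal sketch, where the summation stands in for whatever code you want to measure:

import time

start = time.perf_counter()
total = sum(range(10_000_000))  # stand-in for the code you're timing
elapsed = time.perf_counter() - start
print(f"Computed {total} in {elapsed:0.4f} seconds")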
Example: Download Tutorials

To better compare the different ways that you can add a Python timer to your code, you'll apply different Python timer functions to the same code example throughout this tutorial. If you already have code that you'd like to measure, then feel free to follow the examples with that instead.
The example that you’ll use in this tutorial is a short function that uses the realpython-reader package to download the latest tutorials available here on Real Python. To learn more about the Real Python Reader and how it works, check out How to Publish an Open-Source Python Package to PyPI. You can install realpython-reader on your system with pip:
Shell

$ python -m pip install realpython-reader

Then, you can import the package as reader.
You’ll store the example in a file named latest_tutorial.py. The code consists of one function that downloads and prints the latest tutorial from Real Python:
Python latest_tutorial.py

 1 from reader import feed
 2
 3 def main():
 4     """Download and print the latest tutorial from Real Python"""
 5     tutorial = feed.get_article(0)
 6     print(tutorial)
 7
 8 if __name__ == "__main__":
 9     main()

realpython-reader handles most of the hard work:
- Line 1 imports feed from realpython-reader. This module contains functionality for downloading tutorials from the Real Python feed.
- Line 5 downloads the latest tutorial from Real Python. The number 0 is an offset, where 0 means the most recent tutorial, 1 is the previous tutorial, and so on.
- Line 6 prints the tutorial to the console.
- Line 9 calls main() when you run the script.
When you run this example, your output will typically look something like this:
Shell

$ python latest_tutorial.py
# Python Timer Functions: Three Ways to Monitor Your Code

A timer is a powerful tool for monitoring the performance of your
Python code. By using the `time.perf_counter()` function, you can
measure execution time with exceptional precision, making it ideal
for benchmarking. Using a timer involves recording timestamps before
and after a specific code block and calculating the time difference
to determine how long your code took to run.

[ ... ]

## Read the full article at https://realpython.com/python-timer/ »

* * *

Read the full article at https://realpython.com/python-timer/ »
This Week in KDE Apps: Gear 24.12.0 incoming
Welcome to a new issue of "This Week in KDE Apps"! Every week we cover as much as possible of what's happening in the world of KDE apps.
This week, we are adding the final touches to our applications for the KDE Gear 24.12.0 release coming next Thursday. We are also releasing KPhotoAlbum and KGeoTag, now based on Qt6; improving Itinerary's ticket extractor support coverage in central Europe; and continuing our work on Karp, KDE's new PDF editor.
Meanwhile, as part of the 2024 end-of-year fundraiser, you can "Adopt an App" in a symbolic effort to support your favorite KDE app. This week, we are particularly grateful to Stuart Turton for NeoChat; Lukas, Stuart Turton and J. for Merkuro; Andreas Pietzowski, Dia_FIX and Alex Gurenko for Ark; Stuart Turton and Cameron Bosch for Tokodon; Alex Gurenko and Steven Dunbar for Gwenview; Alex Gurenko, Kasimir den Hertog and Pokipan for KWrite; crysknife, Ian Kidd and Felix Urbasik for KRDC; Ian Nicholson for Alligator; Cameron Radmore for ISO Image Writer; Marcel Janik Kimpel and @siriusfox@social.treehouse.systems for KDE Partition Manager; Marton Daniel for Plasma System Monitor; Alessio Adamo for AudioTube; zukigay for Kasts; Anael for Elisa; Stuart Turton and Clément Aubert for Konqueror; Ulrich Palecek, @ddjivan.bsky.social and Andreas Zautner for Discover; Butters for KolourPaint; KjetilS for krfb; and finally fabacam, Michael Klingberg and Gianmarco Gargiulo for GCompris.
Getting back to all that's new in the KDE App scene, let's dig in!
Akonadi Background service for KDE PIM apps

When updating, adding, or removing a tag/category on a calendar event, the update is immediately visible without having to sync again with a remote server (Daniel Vrátil, 24.12.0 — Link).
Alligator RSS feed reader

The "Refresh" action is now also visible on mobile (Mark Penner, 24.12.0 — Link).
Amarok A powerful music player that lets you rediscover your music

A beta release of the upcoming Amarok 3.2 music player is out for testing — see the announcement email.
Amarok devs fixed the Ampache version check. Ampache is a self-hostable music streaming server, and the version check had been broken since Ampache changed their version format, but it works again now (Ian Abbott, Amarok 3.2.0 — Link).
You can also filter a collection for tracks with missing or empty tags (Tuomas Nurmi, Amarok 3.2.0 — Link — an 11-year-old feature request!).
Arianna EBook reader

Arianna now uses foliate-js instead of epub.js to render EPUB files. foliate-js provides some advantages, like no longer needing to load the whole book into memory, and comes with a better layout engine (Ajay Chauhan, 25.04.0 — Link).
AudioTube YouTube Music app

AudioTube now displays album information in the maximized player view (Kavinu Nethsara, 25.04.0 — Link).
Dolphin Manage your files

Accessibility support in Dolphin was adapted to better work with Orca 47 (Felix Ernst, 25.04.0 — Link), and, continuing with accessibility improvements, after activating a folder in the Dolphin sidebar, the view is now always focused (Felix Ernst, 25.04.0 — Link). Likewise, when clicking on "Open Path" and "Open Path in New Tab" after searching for an item, the view will scroll to the selected item (Akseli Lahtinen, 25.04.0 — Link).
The placeholder message shown when Samba isn't and can't be installed was improved (Ilya Katsnelson, 25.04.0, partially backported to 24.12.0 — Link), and the Flatpak version now allows compressing files into an archive (Justin Zobel, 25.04.0 — Link).
Elisa Play local music and listen to online radio

When removing the last track associated with an artist or a music genre, the artist or genre is now removed from the internal database (Jack Hill, 25.04.0 — Link).
Gwenview Image Viewer

We fixed the incorrect numbering in full screen mode for the first image (Pedro Hernandez, 25.04.0 — Link).
KDE Itinerary Digital travel assistant

Volker wrote a recap of the past two months in Itinerary; you can read it on his blog. The post includes a report on work unrelated to Itinerary development but nevertheless important, like the lobbying of DELFI, a cooperation network of all German federal states for public transport.
The "Vehicle Layout" page and the "Journey Details" page were slightly tweaked and use the new unified component to display the name of the train or bus (Carl Schwan, 25.04.0 — Link 1 and link 2).
We also made significant progress on Itinerary's extractors this week, with many new extractors, including:
- The Colosseum Ticket in Rome (David Pilarcik, Link)
- The Polish online ticket sale system Droplabs (David Pilarcik, Link)
- The train booking platform Leo Express (David Pilarcik, Link)
- The German trade fair, congress, and event ticket sale system, Dimedis Fairmate (Kai Uwe Broulik, Link)
- Google Maps links (Kai Uwe Broulik, Link)
- The European Sleeper seat reservations in French (Luca Weiss, Link)
Kaidan will now display a map preview by default when receiving a geo location (Melvin Keskin — Link).
Karp KDE arranger for PDFs

Karp, KDE's new PDF editor, received visual improvements to its main interface (Carl Schwan — Link 1, link 2 and link 3).
And Nicolas set up crash reporting for Karp (Nicolas Fella — Link).
KDE Connect Seamless connection of your devices

We moved the list of devices to the sidebar in an effort to bring the app to parity with the KCM (Darshan Phaldesai, 25.04.0 — Link).
We also added icons to the plugin config list (Leia uwu, 25.04.0 — Link).
For the bluetooth backend, we improved the speed of transferring data between devices (ivan tkachenko, 25.04.0 — Link).
KGeoTag Photo geotagging program

KGeoTag 1.7.0 is out! This release brings Qt6 support to the app. Read the full announcement.
KPhotoAlbum KDE image management software

KPhotoAlbum 6.0.0 is out! This release also brings Qt6 support to the app. Read the full announcement.
Konsole Use the command line interface

It is now possible to resize Konsole's search bar (Eric D'Addario, 25.04.0 — Link), and to search for an open tab by its name (Troy Hoover, 25.04.0 — Link).
Kongress Conference companion

We now display the speaker's name (if available) in the talk info (Volker Krause, 25.04.0 — Link).
Kleopatra Certificate manager and cryptography app

We fixed a crash when the output directory for decrypting doesn't exist (Tobias Fella, 24.12.1 — Link).
KRDC Connect with RDP or VNC to another computer

We fixed the "Grab Keys" feature on Wayland when switching from and to full screen. Additionally, the "Grab Keys" feature now also correctly forwards every shortcut to the remote applications (Fabio Bas, 25.04.0 — Link 1 and link 2).
We also fixed building KRDC on Haiku (Luc Schrijvers, 24.12.1 — Link).
NeoChat Chat on Matrix

You can now sort rooms in the sidebar based on their most recent activity instead of by unread notifications (Soumyadeep Ghosh, 25.04.0 — Link), and we added a "Copy Link Address" context menu entry when clicking on a message (Kai Uwe Broulik, 25.04.0 — Link).
We fixed the capitalization of the account dialog as well as many of NeoChat's settings pages (Joshua Goins, 25.04.0 — Link), and removed device details from the device display name, as this could leak sensitive information (Tobias Fella, 24.12.0 — Link).
Okular View and annotate documents

When creating a new signature, Okular will automatically pick a font size depending on the available size instead of using a hardcoded size. This allows you to make signatures much smaller than before (Nicolas Fella, 25.04.0 — Link). This work was sponsored by the Technische Universität Dresden.
PlasmaTube Watch YouTube videos

We improved support for Piped, an alternative privacy-friendly YouTube frontend, in PlasmaTube by improving the parsing of its media format information (Alexey Andreyev, 25.04.0 — Link).
Tokodon Browse the Fediverse

The Mastodon client used when posting on Mastodon is now displayed as a Kirigami.Chip element (Joshua Goins, 25.04.0 — Link).
We also fixed support for GoToSocial (snow flurry, 24.12.0 — Link 1 and link 2), and added preliminary support for Iceshrimp (Joshua Goins, 24.12.0 — Link).
Spectacle Screenshot Capture Utility

Spectacle can now export to an animated WebP or a GIF (Noah Davis, 25.04.0 — Link).
…And Everything Else

This blog only covers the tip of the iceberg! If you're hungry for more, check out Nate's blog about Plasma and be sure not to miss his This Week in Plasma series, where every Saturday he covers all the work being put into KDE's Plasma desktop environment.
For a complete overview of what's going on, visit KDE's Planet, where you can find all KDE news unfiltered directly from our contributors.
Get Involved

The KDE organization has become important in the world, and your time and contributions have helped us get there. As we grow, we're going to need your support for KDE to become sustainable.
You can help KDE by becoming an active community member and getting involved. Each contributor makes a huge difference in KDE — you are not a number or a cog in a machine! You don’t have to be a programmer either. There are many things you can do: you can help hunt and confirm bugs, even maybe solve them; contribute designs for wallpapers, web pages, icons and app interfaces; translate messages and menu items into your own language; promote KDE in your local community; and a ton more things.
You can also help us by donating. Any monetary contribution, however small, will help us cover operational costs, salaries, travel expenses for contributors and in general just keep KDE bringing Free Software to the world.
To get your application mentioned here, please ping us in invent or in Matrix.
Thanks to Michael Mischurow and Tobias Fella for proofreading this post.
Paulo Henrique de Lima Santana: Bits from MiniDebConf Toulouse 2024
It always amazes me what opportunities I have thanks to my contributions to the Debian Project, and I am happy to receive this recognition in the form of help with travel to attend events in other countries.
This year, two MiniDebConfs were scheduled for the second half of the year in Europe: the traditional edition in Cambridge in the UK, and a new edition in Toulouse in France. After weighing the difficulties and advantages of attending each of them, I decided on Toulouse, mainly because it was cheaper and because it was in November, giving me more time to plan the trip. I contacted the current DPL, Andreas Tille, explaining my desire to attend the event, and he kindly approved my request for Debian to pay for the tickets. Thanks again to Andreas!
MiniDebConf Toulouse 2024 was held on November 16th and 17th (Saturday and Sunday) in one of the rooms of Capitole du Libre, a traditional Free Software event in the city. Before the MiniDebConf, the team organized a MiniDebCamp on November 14th and 15th at a coworking space.
The whole experience promised to be incredible, and it was! From visiting a city in France for the first time, to attending a local Free Software event, and sharing four days with people from the Debian community from various countries.
Travel and the city
My plan was to leave Belo Horizonte on Monday, pass through São Paulo, and arrive in Toulouse on Tuesday night. I was going to spend the whole of Wednesday walking around the city and then participate in the MiniDebCamp on Thursday.
But the flight that was supposed to leave São Paulo overnight from Monday to Tuesday was cancelled due to a problem with the airplane, and I spent all of Tuesday waiting. I was rebooked on another flight that left in the evening and arrived in Toulouse on Wednesday afternoon. Even though I was very tired from the trip, I still took advantage of the end of the day to walk around the city. It was a shame to have lost an entire day of sightseeing, though.
On Thursday I left early in the morning to walk around a little more before going to the MiniDebCamp venue. I walked a lot and saw several tourist attractions. The city is really beautiful, as they say, especially the houses and buildings made of pink bricks. I was impressed by the narrow and winding streets; at one point it felt like I was walking through a maze. I would arrive at a corner and find five streets crossing in different directions.
The river that runs through the city is very beautiful, and people spend their evenings just hanging out on its banks. There is a lot of history in that area.
I stayed in an Airbnb a 25-minute walk from the coworking space and only 10 minutes from the event venue. It was a very spacious apartment and much cheaper than a hotel.
I arrived at the coworking space where the MiniDebCamp was being held and met up with several friends. I also met some new people, talked about the translation work we do in Brazil, and other topics.
We already knew that the organization would pay for lunch for everyone during the two days of MiniDebCamp, and at a certain point they told us that we could go to the room (which was downstairs from the coworking space) to have lunch. They set up a table with quiches, breads, charcuterie and LOTS of cheese :-) There were several types of cheese and they were all very good. I just found it a little strange because I’m not used to having cheese for lunch :-)
In the evening, we went as a group to dinner at a restaurant in front of the Capitolium, the city’s main tourist attraction.
On the second day, in the morning, I walked around the city a bit more, then went to the coworking space and had another incredible cheese table for lunch.
Video Team
One of my reasons for going to Toulouse was to help the video team set up the equipment for broadcasting and recording the talks. I wanted to follow this work from the beginning and learn some of the details, something I can’t do at DebConfs because I always arrive after the infrastructure has already been set up, and then later reproduce this work at the MiniDebConfs in Brazil, such as the one in Maceió already scheduled for May 1-4, 2025.
As I had agreed with the video team that I would help set up the equipment, on Friday night we went to the University and worked in the room. I asked several questions about what they were doing and about the equipment, and was able to clear up many doubts. Over the next two days I operated one of the cameras during the talks, and on Sunday night I helped put everything away.
Thanks to olasd, tumbleweed and ivodd for their guidance and patience.
The event in general
There was also a meeting with some members of the publicity team who were there with the DPL. We went to a cafeteria and talked mainly about areas where the team could improve.
The talks at MiniDebConf were very good and the recordings are also available here.
I ended up not watching any of the talks from the general Capitole du Libre schedule because they were in French. Still, it’s always great to see how free software events are run abroad and to bring some of those experiences back to our own events.
I hope that MiniDebConf in Toulouse will continue to take place every year, or that the French community will hold the next edition in another city and I will be able to join again :-) If everything goes well, in July next year I will return to France to join DebConf25 in Brest.
LostCarPark Drupal Blog: Drupal Advent Calendar day 8 - SEO
Today we are looking at another aspect of Drupal Starshot that may not generate a lot of excitement, but will make it a lot easier for the average marketer or Drupal site builder to make their site perform well and be easy to find.
Search Engine Optimization (SEO) has often been treated as something of a dark art. Any number of self-proclaimed masters of this art will promise to take your site to new levels, but their ability to deliver varies greatly.
Drupal CMS aims to provide ready-to-use tools to help improve search engine performance, and “SEO Tools” is one of the “goals” offered during the…
Russ Allbery: Review: Why Buildings Fall Down
Review: Why Buildings Fall Down, by Matthys Levy & Mario Salvadori
Illustrator: Kevin Woest
Publisher: W.W. Norton
Copyright: 1992
Printing: 1994
ISBN: 0-393-31152-X
Format: Trade paperback
Pages: 314

Why Buildings Fall Down is a non-fiction survey of the causes of structure collapses, along with some related topics. It is a sequel of sorts to Why Buildings Stand Up by Mario Salvadori, which I have not read. Salvadori was, at the time of writing, Professor Emeritus of Architecture at Columbia University (he died in 1997). Levy is an award-winning architectural engineer, and both authors were principals at the structural engineering firm Weidlinger Associates. There is a revised and updated 2002 edition, but this review is of the original 1992 edition.
This is one of those reviews that comes with a small snapshot of how my brain works. I got fascinated by the analysis of the collapse of Champlain Towers South in Surfside, Florida in 2021, thanks largely to a random YouTube series on the tiny channel of a structural engineer. Somewhere in there (I don't remember where, possibly from that channel, possibly not) I saw a recommendation for this book and grabbed a used copy in 2022 with the intent of reading it while my interest was piqued. The book arrived, I didn't read it right away, I got distracted by other things, and it migrated to my shelves and sat there until I picked it up on an "I haven't read nonfiction in a while" whim.
Two years is a pretty short time frame for a book to sit on my shelf waiting for me to notice it again. The number of books that have been doing that for several decades is, uh, not small.
Why Buildings Fall Down is a non-technical survey of structure failures. These are mostly buildings, but also include dams, bridges, and other structures. It's divided into 18 fairly short chapters, and the discussion of each disaster is brisk and to the point. Most of the structures discussed are relatively recent, but the authors talk about the Meidum Pyramid, the Parthenon (in the chapter on intentional destruction by humans), and the Pavia Civic Tower (in the chapter about building death from old age). If you are someone who has already been down the structural failure rabbit hole, you will find chapters on the expected disasters like the Tacoma Narrows Bridge collapse and the Hyatt Regency walkway collapse, but there are a lot of incidents here, including a short but interesting discussion of the Leaning Tower of Pisa in the chapter on problems caused by soil properties.
What you're going to get, in other words, is a tour of ways in which structures can fail, which is precisely what was promised by the title. This wasn't quite what I was expecting, but now I'm not sure why I was expecting something different. There is no real unifying theme here; sometimes the failure was an oversight, sometimes it was a bad design, sometimes it was a last-minute change, and sometimes it was something unanticipated. There are a lot of factors involved in structure design and any of them can fail. The closest there is to a common pattern is a lack of redundancy and sufficient safety factors, but that lack of redundancy was generally not deliberate and therefore this is not a guide to preventing a collapse. The result is a book that feels a bit like a grab-bag of structural trivia that is individually interesting but only occasionally memorable.
The writing style I suspect will be a matter of taste, but once I got used to it, I rather enjoyed it. In a co-written book, it's hard to separate the voices of the authors, but Salvadori wrote most of the chapter on the law in the first person and he's clearly a character. (That chapter is largely the story of two trials he testified in, which, from his account, involved him verbally fencing with lawyers who attempted to claim his degrees from the University of Rome didn't count as real degrees.) If this translates to his speaking style, I suspect he was a popular lecturer at Columbia.
The explanations of the structural failures are concise and relatively clear, although even with Kevin Woest's diagrams, it's hard to capture the stresses and movement in a written description. (I've found from watching YouTube videos that animations, or even annotations drawn while someone is talking, help a lot.) The framing discussion, well, sometimes that is bombastic in a way that I found amusing:
But we, children of a different era, do not want our lives to be enclosed, to be shielded from the mystery. We are eager to participate in it, to gather with our brothers and sisters in a community of thought that will lift us above the mundane. We need to be together in sorrow and in joy. Thus we rarely build monolithic monuments. Instead, we build domes.
It helps that passages like this are always short and thus don't wear out their welcome. My favorite line in the whole book is a throwaway sentence in a discussion of building failures due to explosions:
With a similar approach, it can be estimated that the chance of an explosion like that at Forty-fifth Street was at most one in thirty million, and probably much less. But this is why life is dangerous and always ends in death.
Going hard, structural engineering book!
It's often appealing to learn about things from their failures because the failures are inherently more dramatic and thus more interesting, but if you were hoping for an introduction to structural engineering, this is probably not the book you want. There is an excellent and surprisingly engaging appendix that covers the basics of structural analysis in 45 pages, but you would probably be better off with Why Buildings Stand Up or another architecture or structural engineering textbook (or maybe a video course). The problem with learning by failure case study is that all the case studies tend to blend together, despite the authors' engaging prose, and nearly every collapse introduces a new structural element with new properties and new failure modes and only the briefest of explanations. This book might make you a slightly more informed consumer of the news, but for most readers I suspect it will be a collection of forgettable trivia told in an occasionally entertaining style.
I think the book I wanted to read was something that went deeper into the process of forensic engineering, not just the outcomes. It's interesting to know what the cause of a failure was, but I'm more interested in how one goes about investigating a failure. What is the process, how do you organize the investigation, and how does the legal system around engineering failures work? There are tidbits and asides here, but this book is primarily focused on the structural analysis and elides most of the work done to arrive at those conclusions.
That said, I was entertained. Why Buildings Fall Down is a bit dated — the opening chapter on airplanes hitting buildings reads much differently now than when it was written in 1992, and I'm sure it was updated in the 2002 edition — but it succeeds in being clear without being soulless or sounding like a textbook. I appreciate an occasional rant about nuclear weapons in a book about architecture. I'm not sure I really recommend this, but I had a good time with it.
Also, I'm now looking for opportunities to say "this is why life is dangerous and always ends in death," so there is that.
Rating: 6 out of 10
Dominique Dumont: New cme command to update Debian Standards-Version field
Hi
While updating my Debian packages, I often have to update a field in the debian/control file.
This field is named Standards-Version, and it declares which version of the Debian policy the package complies with. When updating this field, one must follow the upgrading checklist.
That being said, I maintain a lot of similar packages, so I often have to update this Standards-Version field.
This field can be updated manually with cme fix dpkg (see Managing Debian packages with cme). But this command may make other changes and does not commit the result.
So I’ve created a new update-standards-version cme script that:
- updates the Standards-Version field
- commits the change
For instance:
$ cme run update-standards-version
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
Connecting to api.ftp-master.debian.org to check 31 package versions. Please wait...
Got info from api.ftp-master.debian.org for 31 packages.
Warning in 'source Standards-Version': Current standards version is '4.7.0'. Please read https://www.debian.org/doc/debian-policy/upgrading-checklist.html for the changes that may be needed on your package to upgrade it from standard version '4.6.2' to '4.7.0'.
Offending value: '4.6.2'
Changes applied to dpkg-control configuration:
- source Standards-Version: '4.6.2' -> '4.7.0'
[master 552862c1] control: declare compliance with Debian policy 4.7.0
 1 file changed, 1 insertion(+), 1 deletion(-)

Here’s the generated commit. Note that the generated log mentions the new policy version:
domi@ylum:~/private/debian-dev/perl-stuff/libconfig-model-perl$ git show
commit 552862c1f24479b1c0c8c35a6289557f65e8ff3b (HEAD -> master)
Author: Dominique Dumont <dod[at]debian.org>
Date:   Sat Dec 7 19:06:14 2024 +0100

    control: declare compliance with Debian policy 4.7.0

diff --git a/debian/control b/debian/control
index cdb41dc0..e888012e 100644
--- a/debian/control
+++ b/debian/control
@@ -48,7 +48,7 @@ Build-Depends-Indep: dh-sequence-bash-completion,
                     libtext-levenshtein-damerau-perl,
                     libyaml-tiny-perl,
                     po-debconf
-Standards-Version: 4.6.2
+Standards-Version: 4.7.0
 Vcs-Browser: https://salsa.debian.org/perl-team/modules/packages/libconfig-model-perl
 Vcs-Git: https://salsa.debian.org/perl-team/modules/packages/libconfig-model-perl.git
 Homepage: https://github.com/dod38fr/config-model/wiki
Notes:
- this script can run only if there are no pending changes. Please commit or stash any changes before running this script.
- this script requires:
- cme >= 1.041
- libconfig-model-perl >= 2.155
- libconfig-model-dpkg-perl >= 3.006
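For anyone maintaining many similar packages, the natural next step is to run this script in a loop. Here is a minimal shell sketch of such a batch run, assuming each package lives in its own git checkout under the current directory; the lib*-perl glob and the clean-tree check are illustrative assumptions, not part of the script itself:

# Hypothetical batch update over per-package git checkouts.
# The clean-tree check mirrors the script's own requirement that
# there be no pending changes before it runs.
for pkg in lib*-perl; do
    (
        cd "$pkg" || exit 1
        if git diff-index --quiet HEAD --; then
            cme run update-standards-version
        else
            echo "skipping $pkg: pending changes"
        fi
    )
done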
I hope this will help my fellow Debian developers reduce the boring parts of packaging activities.
All the best
Freelock Blog: Remind customers of abandoned carts
Sometimes a simple reminder can spur a sale. If you have repeat customers that log into your commerce site, you may be able to remind them if they did not complete a checkout.