Feeds

Kay Hayen: Nuitka Release 2.5

Planet Python - Tue, 2024-11-26 18:00

This is to inform you about the new stable release of Nuitka. It is the extremely compatible Python compiler, “download now”.

This release focused on Python 3.13 support, but it also brought improved compatibility, many performance optimizations, enhanced error reporting, and better debugging support.

Bug Fixes
  • Windows: Fixed a regression in onefile mode that incorrectly handled program and command line paths containing spaces. Fixed in 2.4.4 already.

  • Windows: Corrected an issue where console output handles were being opened with closed file handles. Fixed in 2.4.2 already.

  • Standalone: Restored the ability to use trailing slashes on the command line to specify the target directory for data files on Windows. Fixed in 2.4.2 already.

  • Compatibility: Fixed a parsing error that occurred with relative imports in .pyi files, which could affect some extension modules with available source code. Fixed in 2.4.3 already.

  • Modules: Ensured that extension modules load correctly into packages when using Python 3.12. Fixed in 2.4.4 already.

  • Windows: Improved command line handling for onefile mode to ensure full compatibility with quoting. Fixed in 2.4.4 already.

  • Data Directories: Allowed the use of non-normalized paths on the command line when specifying data directories. Fixed in 2.4.5 already.

  • Python 3.11+: Fixed an issue where inspect module functions could raise StopIteration when examining compiled functions on the stack. Fixed in 2.4.5 already.

  • importlib_metadata: Improved compatibility with importlib_metadata by handling cases where it might be broken, preventing potential compilation crashes. Fixed in 2.4.5 already.

  • Plugins: Fixed a crash that occurred when using the no_asserts YAML configuration option. Fixed in 2.4.6 already.

  • Scons: Improved error tolerance when reading ccache log files to prevent crashes on Windows caused by non-ASCII module names or paths. Fixed in 2.4.11 already.

  • Scons: Prevented the C standard option from being applied to C++ compilers, resolving an issue with the splash screen on Windows when using Clang. Fixed in 2.4.8 already.

  • macOS: Enhanced handling of DLL self-dependencies to accommodate cases where DLLs use both .so and .dylib extensions for self-references. Fixed in 2.4.8 already.

  • Compatibility: Fixed a memory leak that occurred when using deepcopy on compiled methods. Fixed in 2.4.9 already.

  • MSYS2: Excluded the bin directory from being considered a system DLL folder when determining DLL inclusion. Fixed in 2.4.9 already.

  • Python 3.10+: Fixed a crash that could occur when a match statement failed to match a class with arguments. Fixed in 2.4.9 already.

  • MSYS2: Implemented a workaround for non-normalized paths returned by os.path.normpath in MSYS2 Python environments. Fixed in 2.4.11 already.

  • Python 3.12: Resolved an issue where Nuitka’s constant code was triggering assertions in Python 3.12.7. Fixed in 2.4.10 already.

  • UI: Ensured that the --include-package option includes both Python modules and extension modules that are sub-modules of the specified package. Fixed in 2.4.11 already.

  • Windows: Prevented encoding issues with CMD files used for accelerated mode on Windows.

  • Standalone: Improved the standard library scan to avoid assuming the presence of specific files, which might have been deleted by the user or a Python distribution.

  • Compatibility: Added suffix, suffixes, and stem attributes to Nuitka resource readers to improve compatibility with file objects.

  • Compatibility: Backported the error message change for yield from used at the module level, using dynamic detection instead of hardcoded text per version.

  • Compatibility: Fixed an issue where calling built-in functions with keyword-only arguments could result in errors due to incorrect argument passing.

  • Compatibility: Fixed reference leaks that occurred when using list.insert and list.index with 2 or 3 arguments.

  • Windows: Prioritized relative paths over absolute paths for the result executable when absolute paths are not file system encodable. This helps address issues related to non-ASCII short paths on some Chinese systems.

  • Compatibility: Improved compatibility with C extensions by handling cases where the attribute slot is not properly implemented, preventing potential segfaults.

  • Compatibility: Prevented the leakage of sys.frozen when using the multiprocessing module and its plugin, resolving a long-standing TODO and potentially breaking compatibility with packages that relied on this behavior.

  • Compatibility: Fixed an issue where matching calls with keyword-only arguments could lead to incorrect optimization and argument passing errors.

  • Compatibility: Corrected the handling of iterators in for loops to avoid assuming the presence of slots, preventing potential issues.

  • macOS: Added support for cyclic DLL dependencies, where DLLs have circular references.

  • Compatibility: Ensured the use of updated expressions during optimization phase for side effects to prevent crashes caused by referencing obsolete information.

  • Python 3.10+: Fixed a crash that could occur in complex cases when re-formulating match statements.

  • Python 3.4-3.5: Corrected an issue in Nuitka’s custom PyDict_Next implementation that could lead to incorrect results in older Python 3 versions.

  • Python 3.10+: Ensured that AttributeError is raised with the correct keyword arguments, avoiding a TypeError that occurred previously.

  • Plugins: Added a data file function that avoids loading packages, preventing potential crashes caused by incompatible dependencies (e.g., numpy versions).

  • Compatibility: Ensured that Nuitka’s package reader closes data files after reading them to prevent resource warnings in certain Python configurations.

  • Standalone: Exposed setuptools contained vendor packages in standalone distributions to match the behavior of the setuptools package.

  • Accelerated Mode: Enabled the django module parameter in accelerated mode to correctly detect used extensions.

  • Compatibility: Prevented resource warnings for unclosed files when trace outputs are sent to files via command line options.

  • Compatibility: Enabled the use of xmlrpc.server without requiring the pydoc module.

  • Plugins: Fixed an issue in the anti-bloat configuration where change_function and change_classes ignored “when” clauses, leading to unintended changes.

  • Python 3.12 (Linux): Enhanced static libpython handling for Linux. Static libpython is now used only when the inline copy is available (not in official Debian packages). The inline copy of hacl is used for all Linux static libpython uses with Python 3.12 or higher.

  • Standalone: Further improved the standard library scan to avoid assuming the presence of files that might have been manually deleted.

  • UI: Fixed the --include-raw-dir option, which was not functioning correctly. Only the Nuitka Package configuration was being used previously.

Package Support
  • arcade: Improved standalone configuration for the arcade package. Added in 2.4.3 already.

  • license-expression: Added a missing data file for the license-expression package in standalone distributions. Added in 2.4.6 already.

  • pydantic: Included a missing implicit dependency required for deprecated decorators in the pydantic package to function correctly in standalone mode. Fixed in 2.4.5 already.

  • spacy: Added a missing implicit dependency for the spacy package in standalone distributions. Added in 2.4.7 already.

  • trio: Updated standalone support for newer versions of the trio package. Added in 2.4.8 already.

  • tensorflow: Updated standalone support for newer versions of the tensorflow package. Added in 2.4.8 already.

  • pygame-ce: Added standalone support for the pygame-ce package. Added in 2.4.8 already.

  • toga: Added standalone support for newer versions of the toga package on Windows. Added in 2.4.9 already.

  • django: Implemented a workaround for a django debug feature that attempted to extract column numbers from compiled frames. Added in 2.4.9 already.

  • PySide6: Improved standalone support for PySide6 on macOS by allowing the recognition of potentially unusable plugins. Added in 2.4.9 already.

  • polars: Added a missing dependency for the polars package in standalone distributions. Added in 2.4.9 already.

  • django: Enhanced handling of cases where the django settings module parameter is absent in standalone distributions. Added in 2.4.9 already.

  • win32ctypes: Included missing implicit dependencies for win32ctypes modules on Windows in standalone distributions. Added in 2.4.9 already.

  • arcade: Added a missing data file for the arcade package in standalone distributions. Added in 2.4.9 already.

  • PySide6: Allowed PySide6 extras to be optional on macOS in standalone distributions, preventing complaints about missing DLLs when they are not installed. Added in 2.4.11 already.

  • driverless-selenium: Added standalone support for the driverless-selenium package. Added in 2.4.11 already.

  • tkinterdnd2: Updated standalone support for newer versions of the tkinterdnd2 package. Added in 2.4.11 already.

  • kivymd: Updated standalone support for newer versions of the kivymd package. Added in 2.4.11 already.

  • gssapi: Added standalone support for the gssapi package. Added in 2.4.11 already.

  • azure.cognitiveservices.speech: Added standalone support for the azure.cognitiveservices.speech package on macOS.

  • mne: Added standalone support for the mne package.

  • fastapi: Added a missing dependency for the fastapi package in standalone distributions.

  • pyav: Updated standalone support for newer versions of the pyav package.

  • py_mini_racer: Added standalone support for the py_mini_racer package.

  • keras: Improved standalone support for keras by extending its sub-modules path to include the keras.api sub-package.

  • transformers: Updated standalone support for newer versions of the transformers package.

  • win32com.server.register: Updated standalone support for newer versions of the win32com.server.register package.

  • Python 3.12+: Added support for distutils in setuptools for Python 3.12 and later.

  • cv2: Enabled automatic scanning of missing implicit imports for the cv2 package in standalone distributions.

  • lttbc: Added standalone support for the lttbc package.

  • win32file: Added a missing dependency for the win32file package in standalone distributions.

  • kivy: Fixed an issue where the kivy clipboard was not working on Linux due to missing dependencies in standalone distributions.

  • paddleocr: Added missing data files for the paddleocr package in standalone distributions.

  • playwright: Added standalone support for the playwright package with a new plugin.

New Features
  • Python 3.13: Added experimental support for Python 3.13.

    Warning

    Python 3.13 support is not yet recommended for production use due to limited testing. On Windows, only MSVC and ClangCL are currently supported due to workarounds needed for incompatible structure layouts.

  • UI: Introduced a new --mode selector to replace the options --standalone, --onefile, --module, and --macos-create-app-bundle.

    Note

    The app mode creates an app bundle on macOS and a onefile binary on other operating systems to provide the best deployment option for each platform.

  • Windows: Added a new hide choice for the --windows-console-mode option. This generates a console program that hides the console window as soon as possible, although it may still briefly flash.

  • UI: Added the --python-flag=-B option to disable the use of bytecode cache (.pyc) files during imports. This is mainly relevant for accelerated mode and dynamic imports in non-isolated standalone mode.

  • Modules: Enabled the generation of type stubs (.pyi files) for compiled modules using an inline copy of stubgen. This provides more accurate and informative type hints for compiled code.

    Note

    Nuitka also adds implicit imports to compiled extension modules, ensuring that dependencies are not hidden.

  • Plugins: Changed the data files configuration to a list of items, allowing the use of when conditions for more flexible control. Done in 2.4.6 already.

  • Onefile: Removed the MSVC requirement for the splash screen in onefile mode. It now works with MinGW64, Clang, and ClangCL. Done for 2.4.8 already.

  • Reports: Added information about the file system encoding used during compilation to help debug encoding issues.

  • Windows: Improved the attach mode for --windows-console-mode when forced redirects are used.

  • Distutils: Added the ability to disable Nuitka in pyproject.toml builds using the build_with_nuitka setting. This allows falling back to the standard build backend without modifying code or configuration. This setting can also be passed on the command line using --config-setting.

  • Distutils: Added support for commercial file embedding in distutils packages.

  • Linux: Added support for using uninstalled self-compiled Python installations on Linux.

  • Plugins: Enabled the matplotlib plugin to react to active Qt and tkinter plugins for backend selection.

  • Runtime: Added a new original_argv0 attribute to the __compiled__ value to provide access to the original start value of sys.argv[0], which might be needed by applications when Nuitka modifies it to an absolute path; a short sketch follows after this list.

  • Reports: Added a list of DLLs that are actively excluded because they are located outside of the PyPI package.

  • Plugins: Allowed plugins to override the compilation mode for standard library modules when necessary.
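
As a hedged sketch of the new original_argv0 runtime attribute mentioned above (the surrounding fallback scaffolding is mine and not taken from the release notes), an application could read the original start value like this:

import sys

# Only Nuitka-compiled programs expose a __compiled__ value at module level;
# fall back to sys.argv[0] when running under plain CPython or older Nuitka.
compiled = globals().get("__compiled__")
if compiled is not None and hasattr(compiled, "original_argv0"):
    start_value = compiled.original_argv0  # sys.argv[0] before Nuitka made it absolute
else:
    start_value = sys.argv[0]
print("Program was started as:", start_value)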

Optimization
  • Performance: Implemented experimental support for “dual types”, which can significantly speed up integer operations in specific cases (achieving speedups of 12x or more in some very specific loops). This feature is still under development but shows promising potential for future performance gains, especially when combined with future PGO (Profile Guided Optimization) work revealing likely runtime types more often and more types being covered.

  • Performance: Improved the speed of module variable access.

    • For Python 3.6 to 3.10, this optimization utilizes dictionary version tags but may be less effective when module variables are frequently written to.

    • For Python 3.11+, it relies on dictionary key versions, making it less susceptible to dictionary changes but potentially slightly slower for cache hits compared to Python 3.10.

  • Performance: Accelerated string dictionary lookups for Python 3.11+ by leveraging knowledge about the key and the module dictionary’s likely structure. This also resolves a previous TODO item, where initial 3.11 support was not as fast as our support for 3.10 was in this domain.

  • Performance: Optimized module dictionary updates to occur only when values actually change, improving caching efficiency.

  • Performance: Enhanced exception handling by removing bloat in the abstracted differences between Python 3.12 and earlier versions. This simplifies the generated C code, reduces conversions, and improves efficiency for all Python versions. This affects both C compile time and runtime performance favorably and solves a huge TODO for Python 3.12 performance.

  • Performance: Removed the use of CPython APIs calls for accessing exception context and cause values, which can be slow.

  • Performance: Utilized Nuitka’s own faster methods for creating int and long values, avoiding slower CPython API calls.

  • Performance: Implemented a custom variant of _PyGen_FetchStopIterationValue to avoid CPython API calls in generator handling, further improving performance on generators, coroutines and asyncgen.

  • Windows: Aligned with CPython’s change in reference counting implementation on Windows for Python 3.12+, which improves performance with LTO (Link Time Optimization) enabled.

  • Optimization: Expanded static optimization to include unary operations, improving the handling of number operations and preparing for full support of dual types.

  • Optimization: Added static optimization for os.stat and os.lstat calls.

  • Performance: Passed the exception state directly into unpacking functions, eliminating redundant exception fetching and improving code efficiency.

  • Performance: Introduced a dedicated helper for unpacking length checks, resulting in faster and more compact code helping scalability as well.

  • Performance: Generated more efficient code for raising built-in exceptions by directly creating them through the base exception’s new method instead of calling them as functions. This can speed up some things by a lot.

  • Performance: Optimized exception creation by avoiding unnecessary tuple allocations for empty exceptions. This hack avoids hitting the memory allocator as much.

  • Performance: Replaced remaining uses of PyTuple_Pack with Nuitka’s own helpers to avoid CPython API calls.

  • Code Generation: Replaced implicit exception raise nodes with direct exception creation nodes for improved C code generation.

  • Windows: Aligned with CPython’s change in managing object reference counters on Windows for Python 3.12+, improving performance with LTO enabled.

  • Performance: Removed remaining CPython API calls when creating int values in various parts of the code, including specialization code, helpers, and constants loading.

  • Windows: Avoided scanning for DLLs in the PATH environment variable when they are not intended to be used from the system. This prevents potential crashes related to non-encodable DLL paths and makes those scans faster too.

  • Windows: Updated MinGW64 from version 13.2 to 14.2 for potentially improved binary code generation with that compiler.

  • Code Size: Reduced the size of constant blobs by avoiding module-level constants for the global values -1, 0, and 1.

  • Code Generation: Improved code generation for variables by directly placing NameError exceptions into the thread state when raised, making for more compact C code.

  • Optimization: Statically optimized the sys.ps1 and sys.ps2 values to not exist (unless in module mode), potentially enabling more static optimization in packages that detect interactive usage checking them.

  • Performance: Limited the use of tqdm locking to no-GIL and Scons builds where threading is actively used.

  • Optimization: Implemented a faster check for non-frame statement sequences by decoupling frames and normal statement sequences and using dedicated accessors. This improves performance during the optimization phase.

Anti-Bloat
  • Prevented the inclusion of importlib_metadata for the numpy package. Added in 2.4.2 already.

  • Avoided the use of dask in the pandera package. Added in 2.4.5 already.

  • Removed numba for newer versions of the shap package. Added in 2.4.6 already.

  • Prevented attempts to include both Python 2 and Python 3 code for the aenum package, avoiding SyntaxError warnings. Added in 2.4.7 already.

  • Enhanced handling for the sympy package. Added in 2.4.7 already.

  • Allowed pydoc for the pyqtgraph package. Added in 2.4.7 already.

  • Avoided pytest in the time_machine package. Added in 2.4.9 already.

  • Avoided pytest in the anyio package.

  • Avoided numba in the pandas package.

  • Updated anti-bloat measures for newer versions of the torch package with increased coverage.

  • Avoided pygame.tests and cv2 for the pygame package.

  • Allowed unittest in the absl.testing package.

  • Allowed setuptools in the tufup package.

  • Avoided test modules when using the bsdiff4 package.

  • Treated the use of the wheel module the same as using the setuptools package.

Organizational
  • Development Environment: Added experimental support for a devcontainer to the repository, providing an easier way to set up a Linux-based development environment. This feature is still under development and may require further refinement.

  • Issue Reporting: Clarified the issue reporting process on GitHub, emphasizing the importance of testing reproducers against Python first to ensure the issue is related to Nuitka.

  • Issue Reporting: Discouraged the use of --deployment in issue reports, as it hinders the automatic identification of issues; it should be the first thing to remove.

  • UI: Improved the clarity of the help message for the option that marks data files as external, emphasizing that files must be included before being used.

  • UI: Added checks to the Qt plugins to ensure that specified plugin families exist, preventing unnoticed errors.

  • UI: Implemented heuristic detection of terminal link support, paving the way for adding links to options and groups in the command line interface.

  • UI: Removed obsolete caching-related options from the help output, as they have been replaced by more general options.

  • Plugins: Improved error messages when retrieving information from packages during compilation.

  • Quality: Implemented a workaround for an isort bug that prevented it from handling UTF-8 comments.

  • Quality: Updated GitHub actions to use clang-format-20.

  • Quality: Updated to the latest version of black for code formatting.

  • Release Process: Updated the release script tests for Debian and PyPI to use the correct runner names. Changed in 2.4.1 already.

  • UI: Disabled progress bar locking, as Nuitka currently doesn’t utilize threads.

  • UI: Added heuristic detection of terminal link support and introduced an experimental terminal link as a first step towards a more interactive command line interface.

  • Debugging: Fixed a crash in the “explain reference counts” feature that could occur with unusual dict values mistaken for modules.

  • Debugging: Included reference counts of tracebacks when dumping reference counts at program end.

  • Debugging: Added assertions and traces to improve debugging of input/output handling.

  • Quality: Added checks for configuration module names in Nuitka package configuration to catch errors caused by using filenames instead of module names.

  • UI: Removed obsolete options controlling cache behavior, directing users to the more general cache options.

  • Scons: Ensured that the CC environment variable is used consistently for --version and onefile bootstrap builds, as well as the Python build, preventing inconsistencies in compiler usage and outputs.

  • Distutils: Added the compiled-package-hidden-by-package mnemonic for use in distutils to handle the expected warning when a Python package is replaced with a compiled package and the Python code is yet to be deleted.

  • Dependency Management: Started experimental support for downloading Nuitka dependencies like ordered-set. This feature is not yet ready for general use.

Tests
  • Added Python 3.13 to the GitHub Actions test matrix.

  • Significantly enhanced construct-based tests for clearer results. The new approach executes code with a boolean flag instead of generating different code, potentially leading to the removal of custom templating.

  • Removed the 2to3 conversion code from the test suite, as it is being removed from newer Python versions. Tests are now split with version requirements as needed.

  • Fixed an issue where the test runner did not discover and use Python 3.12+, resulting in insufficient test coverage for those versions on GitHub Actions.

  • Ensured that the compare_with_cpython test function defaults to executing the system’s Python interpreter instead of relying on the PYTHON environment variable.

  • Set up continuous integration with Azure Pipelines to run Nuitka tests against the factory branch on each commit.

  • Enforced the use of static libpython for construct-based tests to eliminate DLL call overhead and provide more accurate performance measurements.

  • Improved the robustness of many construct tests, making them less sensitive to unrelated optimization changes.

  • Removed a test that was only applicable to Nuitka Commercial, as it was not useful to always skip it in the standard version. Commercial tests are now also recognized by their names.

  • Added handling for segmentation faults in distutils test cases, providing debug output for easier diagnosis of these failures.

  • Prevented resource warnings for unclosed files in a reflected test.

Cleanups
  • WASI: Corrected the signatures of C function getters and setters for compiled types in WASI to ensure they match the calling conventions. Casts are now performed locally to the compiled types instead of in the function signature. Call entries also have the correct signature used by Python C code.

  • WASI: Improved code cleanliness by adhering to PyCFunction signatures in WASI.

  • Code Generation: Fixed a regression in code generation that caused misaligned indentation in some cases.

  • Code Formatting: Changed some code for identical formatting with clang-format-20 to eliminate differences between the new and old versions.

  • Caching: Enforced proper indentation in Nuitka cache files stored in JSON format.

  • Code Cleanliness: Replaced checks for Python 3.4 or higher with checks for Python 3, simplifying the code and reflecting the fact that Python 3.3 is no longer supported.

  • Code Cleanliness: Removed remaining Python 3.3 specific code from frame templates.

  • Code Cleanliness: Performed numerous spelling corrections and renamed internal helper functions for consistency and clarity.

  • Plugins: Renamed the get_module_directory helper function in the Nuitka Package configuration to remove the leading underscore, improving readability.

  • Plugins: Moved the numexpr.cpuinfo workaround to the appropriate location in the Nuitka Package configuration, resolving an old TODO item.

Summary

This is a major release that brings support for Python 3.13, relatively soon after its release.

Our plugin system and Nuitka plugin configuration were used a lot to support many more third-party packages, along with numerous other enhancements in the domain of avoiding bloat.

This release focuses on improved compatibility, new breakthrough performance optimizations to build on in the future, enhanced error reporting, and better debugging support.

Categories: FLOSS Project Planets

EuroPython Society: List of EPS Board Candidates for 2024/2025

Planet Python - Tue, 2024-11-26 17:27

At this year’s EuroPython Society General Assembly (GA), planned for Sunday, December 1st, 2024, 20:00 CET, we will vote in a new board of the EuroPython Society for the term 2024/2025.

List of Board Candidates

The EPS bylaws require one chair, one vice chair and 2 - 7 board members. The following candidates have stated their willingness to work on the EPS board. We are presenting them here (in alphabetical order by first name).

We will be updating this list in the days before the GA. Please send in any nominations or self-nominations to board@europython.eu. For more information please check our previous post here: https://europython-society.org/2024-general-assembly-announcement/

Please note that our bylaws do not restrict nominations to people on this list. It is even possible to self-nominate or nominate other candidates at the GA itself. However, in the interest of giving members a better chance to review the candidate list, we’d like to encourage all nominations to be made before the GA.

The following fine folks have expressed their desire to run for the next EPS board elections: Anders Hammarquist, Aris Nivorlis, Artur Czepiel, Cyril Bitterich, Mia Bajić, Shekhar Koirala.

Anders Hammarquist

Pythonista / Consultant / Software architect

Anders has been running his own Python consultancy business, AB Struse, since 2019 and is currently mostly involved with using Python in industrial automation. He has been using Python since 1995, and has fostered its use in at least four companies.

He helped organize EuroPython 2004 and 2005, and has attended and given talks at several EuroPythons since then. He has handled the Swedish financials of the EuroPython Society since 2016 and has served as board member since 2017.

Aris Nivorlis

Pythonista / Geoscientist / Data Steward

Aris is a researcher at Deltares, a non-profit research institute in the Netherlands, where he combines his expertise in geoscience with his passion for Python and data stewardship to address real-world challenges. His journey with Python started during his doctoral studies, where he became a passionate advocate for the language and supported several colleagues in adopting Python for their research.

Aris has been involved in the Python community for the past four years. He is the Chair of PyCon Sweden and has been a core organizer for the past three conferences. He was a EuroPython organizer in 2024, leading the Ops Team, an experience he describes as both challenging and incredibly rewarding.

Aris is running for the EuroPython Society (EPS) Board to help shape its future direction. He is particularly interested in how EPS can further support local Python communities, events, and projects, while ensuring the success of the EuroPython conference. Aris aims to build on the foundation of previous efforts, working toward a more independent and sustainable organising team for EuroPython. One of his key goals is to lower the barriers for others to get involved as volunteers, organizers, and board members, fostering a more inclusive and accessible society.

Artur Czepiel (nomination for Chair)

Software developer

I started using Python in 2008 and attended my first EuroPython in 2016, and it’s been an incredible journey ever since. Over the years, I’ve had the opportunity to contribute to five EuroPython conferences, including serving four terms on the board and chairing the conference last year.

Despite that, I still have new ideas for improvements and a couple of unfinished projects I’d like to see moving forward. :)

Cyril Bitterich

Operations dude / Organiser / Systems Engineer

Cyril’s first contact with the EuroPython community was back in 2019 in Basel during a break between workshops. Starting with helping prepare goodie bags, he went on to assist with setting up the conference location, and took on the role of a general runner during the event. After becoming a late addition to the Ops team in Dublin, he was part of the Ops and Programme teams for both Prague editions of the conference. His firefighting skills proved invaluable as he supported other teams and the general organisation, making him a de facto member of the 2024 core organisers team.

Enjoying the warmth of the community and gaining experience in a lot of different roles, it’s now time for him to pass on the lessons he’s learned in a more structured way. Having taken on smaller and ad-hoc leadership roles during the conference, he now aims to play an even more active role in the year-to-year operations of the EuroPython community.

With a background in successfully playing firefighter at organising capoeira events for 100 to 2000 attendees at locations all over the world (Asia and Antarctica are still on the to-do list 😉), and working with leaders, teams and attendees from diverse cultures, his knowledge and experience extend far beyond the last 3 years of the EuroPython conference.

His goal is to bring more structure to the EPS as a whole and the conference teams where needed. A key focus is ensuring effective communication, be it through improving documentation or maintaining open and regular dialogue with the teams and everyone else involved. Ultimately, he aims to help strike a healthy balance between the community-driven, volunteer spirit and the professionalism expected from an established conference like EuroPython. However, the aim is not to disrupt or rebuild everything from scratch. Instead, he seeks to build on the strong foundation established by the board members active in the last years - while shamelessly taking full advantage of their counsel along the way.

This should establish a stable foundation within the EuroPython Society that can be adopted or directly used by other local communities in Europe. On the EPS side, this will hopefully open up resources to support other communities based on their specific needs.

Mia Bajić

Software Engineer & Community Events Organizer

I’m a software engineer and community events organizer. Since joining the Python community in 2021, I’ve led Pyvo meetups, brought Python Pizza to the Czech Republic, and contributed to PyCon CZ 23 as well as EuroPython 2023 & 2024. I’ve spoken on technical topics at major conferences, including PyCon US, DjangoCon, EuroPython and many other PyCons across Europe.

I’m running for the board to learn, grow, and give back to the community that has given me so much. The main topic I would like to focus on is sustainability. I value setting clear goals, fostering open and transparent culture, and delivering measurable results.

Inspired by the folks from the Django Software Foundation, I decided to share more about my background, motivations, and small improvements I’d like to see implemented this year on my blog: https://clytaemnestra.github.io/tech-blog/eps-elections.

Shekhar Koirala

Machine Learning Engineer

I joined EuroPython as a remote volunteer in 2022 and later became part of the onsite volunteer team in Dublin the same year. Over the past two years, I’ve worked with the Ops team, supporting other onsite volunteers. My volunteering experience also includes contributions to Python Ireland and PyData Kathmandu. This year, I attended PyData Amsterdam, PyCon Sweden, and a non-Python volunteer-led conference in Berlin, Germany, where I sought to understand what truly makes a conference great. I realized that, at the core, it’s the people and their love for the community that make a conference exceptional.

Whether or not I join the board, I will continue volunteering for EuroPython. But if I am selected as a board member, I aim to support volunteers and foster a wholesome environment, just as I have always experienced. I also want to strengthen the connection between EuroPython and the local community.

Professionally, I work as a Machine Learning Engineer at Identv, using Python for both work and hobbies. Recently, I delivered a talk and conducted a workshop at PyCon Ireland, and I’m excited to kickstart my open-source journey. Besides sitting in front of the computer, I love to hike, take photos and try not to get lost in the wild.

What does the EPS Board do?

The EPS board is made up of up to 9 directors (including 1 chair and 1 vice chair); the board runs the day-to-day business of the EuroPython Society, including running the EuroPython conference series, and supports the community through various initiatives such as our grants programme. The board collectively takes up the fiscal and legal responsibility of the Society.

For more details you can check our previous post here: https://europython-society.org/2024-general-assembly-announcement/#what-does-the-board-do

Categories: FLOSS Project Planets

Dries Buytaert: Drupal 11 released

Planet Drupal - Tue, 2024-11-26 15:29

Today is a big day for Drupal as we officially released Drupal 11!

In recent years, we've seen an uptick in innovation in Drupal. Drupal 11 continues this trend with many new and exciting features. You can see an overview of these improvements in the video below:

Drupal 11 has been out for less than six hours, and updating my personal site was my first order of business this morning. I couldn't wait! Dri.es is now running on Drupal 11.

I'm particularly excited about two key features in this release, which I believe are transformative and will likely reshape Drupal in the years ahead:

  1. Recipes (experimental): This feature allows you to add new features to your website by applying a set of predefined configurations.
  2. Single-Directory Components: SDCs simplify front-end development by providing a component-based workflow where all necessary code for each component lives in a single, self-contained directory.

These two new features represent a major shift in how developers and site builders will work with Drupal, setting the stage for even greater improvements in future releases. For example, we'll rely heavily on them in Drupal Starshot.

Drupal 11 is the result of contributions from 1,858 individuals across 590 organizations. These numbers show how strong and healthy Drupal is. Community involvement remains one of Drupal's greatest strengths. Thank you to everyone who contributed to Drupal 11!

Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #657 (Nov. 26, 2024)

Planet Python - Tue, 2024-11-26 14:30

#657 – NOVEMBER 26, 2024
View in Browser »

NumPy Practical Examples: Useful Techniques

In this tutorial, you’ll learn how to use NumPy by exploring several interesting examples. You’ll read data from a file into an array and analyze structured arrays to perform a reconciliation. You’ll also learn how to quickly chart an analysis and turn a custom function into a vectorized function.
REAL PYTHON

Loop Targets

Loop assignment allows you to assign to a dict item in a for loop. This post covers what that means and that it is no more costly than regular assignment.
NED BATCHELDER

Introducing the Windsurf Editor. Tomorrow’s IDE, Today

Introducing the Windsurf Editor, the first agentic IDE. All the features you know and love from Codeium’s extensions plus new capabilities such as Cascade that act as collaborative AI agents, combining the best of copilot and agent systems →
CODEIUM sponsor

Vector Animations With Python

This post shows you how to use gizeh and moviepy to create animations using vector graphics.
DEEPNOTE.COM

Take the 2024 Django Developers Survey

DJANGO SOFTWARE FOUNDATION

Python 3.14.0 Alpha 2 Released

CPYTHON DEV BLOG

Quiz: Python Dictionary Comprehensions

REAL PYTHON

PyTexas (Austin) Call for Proposals

PYTEXAS.ORG

Quiz: Namespaces and Scope in Python

REAL PYTHON

Articles & Tutorials

Avoid Counting in Django Pagination

Django’s Paginator class separates data into chunks to display pages of results. By default, underneath it uses a SQL COUNT(*) call which for large amounts of data can be expensive. This article shows you how to get around that.
NIK TOMAZIC

Working With TOML and Python

TOML is a configuration file format that’s becoming increasingly popular in the Python community. In this video course, you’ll learn the syntax of TOML and explore how you can work with TOML files in your own projects.
REAL PYTHON course

Categories of Leadership on Technical Teams

Understanding the different types of technical leadership can help you better structure teams and define roles and responsibilities. This post talks about the various types of technical leaders you might encounter.
BEN KUHN

Quick Prototyping With Sqlite3

Python’s sqlite3 bindings make it a great tool for quick prototyping while you’re working out what it is exactly that you’re building and what kind of database schema makes sense.
JUHA-MATTI SANTALA

Django Performance and Optimization

“This document provides an overview of techniques and tools that can help get your Django code running more efficiently - faster, and using fewer system resources.”
DJANGO SOFTWARE FOUNDATION

Python Dependency Management Is a Dumpster Fire

Managing dependencies in Python can be a bit of a challenge. This deep dive article shows you all the problems and how the problems are mitigated if not solved.
NIELS CAUTAERTS

Under the Microscope: Ecco the Dolphin

This article shows the use of Ghidra and Python to reverse engineer the encoding scheme of the Sega Dreamcast game Ecco the Dolphin.
32BITS

Is Python Really That Slow?

The speed of Python is an age old conversation. This post talks about just what it does and doesn’t mean.
MIGUEL GRINBERG

Threads Beat Async/Await

Further musings about async & await and why Armin thinks virtual threads are a better model.
ARMIN RONACHER

Python for R Users

Resources for getting better at Python for experienced R developers
STEPHEN TURNER

Projects & Code

nbformat: Base Implementation of Jupyter Notebook Format

PYPI.ORG

pex: Generate Python EXecutable Files, Lock Files and Venvs

GITHUB.COM/PEX-TOOL

Async, Pure-Python Server-Side Rendering Engine

VOLFPETER.GITHUB.IO • Shared by Peter Volf

django-tasks: Background Workers Reference Implementation

GITHUB.COM/REALORANGEONE

dsRAG: Retrieval Engine for Unstructured Data

GITHUB.COM/D-STAR-AI

Events

Weekly Real Python Office Hours Q&A (Virtual)

November 27, 2024
REALPYTHON.COM

Python New Zealand: Mapping Hurricane Wind Speeds From Space

November 28, 2024
MEETUP.COM

SPb Python Drinkup

November 28, 2024
MEETUP.COM

PyCon Wroclaw 2024

November 30 to December 1, 2024
PYCONWROCLAW.COM

PyDay Togo 2024

November 30 to December 1, 2024
LU.MA

PyDelhi User Group Meetup

November 30, 2024
MEETUP.COM

Happy Pythoning!
This was PyCoder’s Weekly Issue #657.
View in Browser »

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

MidCamp - Midwest Drupal Camp: Session Submission Now Open for MidCamp 2025!

Planet Drupal - Tue, 2024-11-26 11:25
Session Submission Now Open for MidCamp 2025!

As the season of gratitude approaches, we’re excited to celebrate you—our future speakers! If you've got an idea for a session, now's the time to get involved in MidCamp 2025, happening May 20-22 in Chicago.

Call for Speakers

Since 2014, MidCamp has hosted over 300 amazing sessions, and we’re ready to add your talk to that legacy. We’re seeking presentations for all skill levels—from Drupal beginners to advanced users, as well as end users and business professionals. Check out our session tracks for full details on the types of talks we’re looking for.

Not quite ready? No worries! Join us for one of our Speaker Workshops:

  • December 2024 (TBD): Crafting an Outstanding Proposal
  • March 2025 (TBD): Polishing Your Presentation (Open to both MidCamp and DrupalCon Atlanta 2025 presenters!)
Key Dates
  • Session Proposals Open: November 25, 2024

Submit your session now!

  • Proposal Deadline: January 12, 2025
  • Speakers Notified: Week of February 17, 2025
  • MidCamp Sessions: May 20-21, 2025
P.S.

We hear Florida is also calling for submissions—but let’s be real, we know where your heart lies. 😊

Sponsor MidCamp

Looking to connect with the Drupal community? Sponsoring MidCamp is the way to do it! With packages starting at $600, there are opportunities to suit any budget. Whether you’re recruiting talent, growing your brand, or simply supporting the Drupal ecosystem, MidCamp sponsorship offers great value.

Act early to maximize your exposure!

Learn more about sponsorship opportunities

Stay in the Loop

Don’t miss a beat!

Keep an eye on our news page and subscribe to our newsletter for updates on the venue, travel options, social events, and speaker announcements.

Ready to submit your session? Let’s make MidCamp 2025 unforgettable!

Categories: FLOSS Project Planets

Web Search Keywords

Planet KDE - Tue, 2024-11-26 10:32

Did you know about a small but very useful feature from KDE?

Open krunner via Alt+Space and type qw:KDE to search Qwant for KDE:

Pressing Enter will open up your browser with the specified KDE search on Qwant!

There are a lot of other Web Search Keywords like:

Wikipedia:

Invent:

Translate from English to German on dict.cc

You can find all of them by opening Web Search Keywords on krunner:

Extra: Create your own!

I use often a Fedora tool called COPR, so let’s use it as an example to create our own web search keyword.

Do your search in the webpage:

And now the important part you need:

Now back to the Web Search Keyword settings:

Fill in the data needed taking care of the placeholder for our input!:

Now we have our own Web Search Keyword:

NOTE: for some reason I had to click on Apply and OK until all the different setting windows were closed before the new custom Web Search Keyword worked

And the result:

I hope it’s useful to somebody!

Categories: FLOSS Project Planets

Matt Glaman: Restrict Composer dependency updates to only patch releases

Planet Drupal - Tue, 2024-11-26 09:00

I was doing website maintenance and checked for outdated dependencies with composer outdated. I usually filter with -D for checking direct dependencies and -p for packages with patch releases. These are typically easy pickings. I saw I was on 2.1.3 of the Honeypot module and 2.1.4 was available. So I ran composer update drupal/honeypot. I noticed the module was updated to 2.2.0, because my Composer constraint is drupal/honeypot: ^2.0, allowing minor updates. I figured that was fine enough. Turns out it wasn't.

Categories: FLOSS Project Planets

Real Python: Managing Dependencies With Python Poetry

Planet Python - Tue, 2024-11-26 09:00

When your Python project relies on external packages, you need to make sure you’re using the right version of each package. After an update, a package might not work as it did before. A dependency manager like Python Poetry helps you specify, install, and resolve external packages in your projects. This way, you can be sure that you always work with the correct dependency version on every machine.

In this video course, you’ll learn how to:

  • Create a new project using Poetry
  • Add Poetry to an existing project
  • Configure your project through pyproject.toml
  • Pin your project’s dependency versions
  • Install dependencies from a poetry.lock file
  • Run basic Poetry commands using the Poetry CLI

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Highlights from the Digital Public Goods Alliance Annual Members Meeting 2024

Open Source Initiative - Tue, 2024-11-26 06:45

This month, I had the privilege of representing the Open Source Initiative at the Digital Public Goods Alliance (DPGA) Annual Members Meeting. Held in Singapore, this event marked our second year participating as members, following our first participation in Ethiopia. It was an inspiring gathering of innovators, developers and advocates working to create a thriving ecosystem for Digital Public Goods (DPGs) as part of UNICEF’s initiative to advance the United Nations’ sustainable development goals (SDGs).

Keynote insights

The conference began with an inspiring keynote by Liv Marte Nordhaug, CEO of the DPGA, who made a call for governments and organizations to incorporate:

  1. Open Source first principles
  2. Open data at scale
  3. Interoperable Digital Public Infrastructure (DPI)
  4. DPGs as catalysts for climate change action

These priorities underscored the critical role of DPGs in fostering transparency, accountability and innovation across sectors.

DPGs and Open Source AI

A standout feature of the first day of the event was the Open Source AI track led by Amreen Taneja, DPGA Standards Lead, which encompassed three dynamic sessions:

  1. Toward AI Democratization with Digital Public Goods: This session explored the role of DPGs in contributing to the democratization of AI technologies, including AI use, development and governance, to ensure that everyone benefits from AI technology equally.
  2. Fully Open Public Interest AI with Open Data: This session highlighted the need for open AI infrastructure supported by accessible, high-quality datasets, especially in global majority countries. Discussions revolved around how open training data sets ought to be licensed to ensure open, public interest AI.
  3. Creating Public Value with Public AI: This session examined real-world applications of generative AI in public services. Governments and NGOs showcased how AI-enabled tools can effectively tackle social challenges, leveraging Open Source solutions within the AI stack.
DPG Product Fair: showcasing innovation

The second day of the event was marked by the DPG Product Fair, which provided a science-fair-style platform for showcasing DPGs. Notable examples included:

  • India’s eGov DIGIT Open Source platform, serving over 1 billion citizens with robust digital infrastructure.
  • Singapore’s Open Government Products, which leverages Open Source and enables open collaboration, allowing this small nation to expand its impact together with other Southeast Asian nations.

One particularly engaging session was Sarah Espaldon’s “Improving Government Services with Legos.” This presentation from the Singapore government highlighted the benefits of modular DPGs in enhancing service delivery and building flexible DPI capabilities.

Privacy best practices for DPGs

A highlight from the third and final day of the event was the privacy-focused workshop co-hosted by the DPGA and the Open Knowledge Foundation. As privacy becomes a central concern for DPGs, the DPGA introduced its Standard Expert Group to refine privacy requirements and develop best practice guidelines. The interactive session provided invaluable feedback, driving forward the development of robust privacy standards for DPGs.

Looking ahead

The event reaffirmed the potential of Open Source technologies to transform global public goods. As we move forward, the Open Source Initiative is committed to advancing these conversations, including around the Open Source AI Definition and fostering a more inclusive digital ecosystem. A special thanks to the dedicated DPGA team—Liv Marte Nordhaug, Lucy Harris, Max Kintisch, Ricardo Miron, Luciana Amighini, Bolaji Ayodeji, Lea Gimpel, Pelin Hizal Smines, Jon Lloyd, Carol Matos, Amreen Taneja, and Jameson Voisin—whose efforts made this conference a success. We look forward to another year of impactful collaboration and to the Annual Members Meeting next year in Brazil!

Categories: FLOSS Research

Emanuele Rocca: Building Debian packages The Right Way

Planet Debian - Tue, 2024-11-26 03:34

There is more than one way to do it, but it seems that The Right Way to build Debian packages today is using sbuild with the unshare backend. The most common backend before the rise of unshare was schroot.

The official Debian Build Daemons have recently transitioned to using sbuild with unshare, providing a strong motivation to consider making the switch. Additionally the new approach means: (1) no need to configure schroot, and (2) no need to run the build as root.

Here are my notes about moving to the new setup, for future reference and in case they may be useful to others.

First I installed the required packages:

apt install sbuild mmdebstrap uidmap

Then I created the following script to update my chroots every night:

#!/bin/bash

for arch in arm64 armhf armel; do
    HOME=/tmp mmdebstrap --quiet --arch=$arch --include=ca-certificates --variant=buildd unstable \
        ~/.cache/sbuild/unstable-$arch.tar http://127.0.0.1:3142/debian
done

In the script, I’m calling mmdebstrap with --quiet because I don’t want to get any output on successful execution. The script is running in cron with email notifications, and I only want to get a message if something goes south. I’m setting HOME=/tmp for a similar reason: the unshare user does not have access to my actual home directory, and by default dpkg tries to use $HOME/.dpkg.cfg as the configuration file. By overriding HOME I avoid the following message to standard error:

dpkg: warning: failed to open configuration file '/home/ema/.dpkg.cfg' for reading: Permission denied

Then I added the following to my sbuild configuration file (~/.sbuildrc):

$chroot_mode = 'unshare';
$unshare_tmpdir_template = '/dev/shm/tmp.sbuild.XXXXXXXXXX';

The first option sets the sbuild backend to unshare, whereas unshare_tmpdir_template is needed on Bookworm to ensure that the build process runs in memory rather than on disk for performance reasons. Starting with Trixie, /tmp is by default a tmpfs so the setting won’t be needed anymore.

Packages for different architectures can now be built as follows:

# Tarball used: ~/.cache/sbuild/unstable-arm64.tar
$ sbuild --dist=unstable hello

# Tarball used: ~/.cache/sbuild/unstable-armhf.tar
$ sbuild --dist=unstable --arch=armhf hello

If you have any comments or suggestions about any of this, please let me know.

Categories: FLOSS Project Planets

Specbee: The All-New Drupal CMS: Find Answers to Your 13 Most Common Questions

Planet Drupal - Tue, 2024-11-26 01:25
We know you love FAQs! We'll answer the questions you have about the all-new Drupal CMS (yep, 13 of them). From features to migration, find out everything you need to know about this groundbreaking new CMS.
Categories: FLOSS Project Planets

Armin Ronacher: Constraints are Good: Python's Metadata Dilemma

Planet Python - Mon, 2024-11-25 19:00

There is currently an effort underway to build a new universal lockfile standard for Python, most of which is taking place on the Python discussion forum. This initiative has highlighted the difficulty of creating a standard that satisfies everyone. It has become clear that different Python packaging tools have slightly different ideas in mind of what a lockfile is supposed to look like or even be used for.

In those discussions however also a small other aspect re-emerged: Python has a metadata problem. Python's metadata system is too complex and suffers from what I would call “lack of constraints”.

JavaScript: Example of Useful Constraints

JavaScript provides an excellent example of how constraints can simplify and improve a system. In JavaScript, metadata is straightforward. Whether you develop against a package locally or if you are using a package from npm, metadata represents itself the same way. There is a single package.json file that contains the most important metadata of a package such as name, version or dependencies. This simplicity imposes significant but beneficial constraints:

  • There is a 1:1 relationship between an npm package and its metadata. Every npm package has a single package.json file that is the source of truth of metadata. Metadata is trivially accessible, even programmatically, via require('packageName/package.json').
  • Dependencies (and all other metadata) are consistent across platforms and architectures. Platform-specific binaries are handled via a filter mechanism (os and cpu) paired with optionalDependencies. [1]
  • All metadata is static, and updates require explicit changes to package.json prior to distribution or installation. Tools are provided to manipulate that metadata such as npm version patch which will edit the file in-place.

These constraints offer several benefits:

  • Uniform behavior regardless of whether a dependency is installed locally or from a remote source. There is not even a filesystem layout difference between what comes from git or npm. This enables things like replacing an installed dependency with a local development copy, without change in functionality.
  • There is one singular source of truth for all metadata. You can edit package.json and any consumer of that metadata can just monitor that file for changes. No complex external information needs to be consulted.
  • Resolvers can rely on a single API call to fetch dependency metadata for a version, improving efficiency. Practically this also means that the resolver only needs to hit a single URL to retrieve all possible dependencies of a dependency. [2]
  • It makes auditing much easier because there are fewer moving parts and just one canonical location for metadata.
Python: The Cost of Too Few Constraints

In contrast, Python has historically placed very few constraints on metadata. For example, the old setup.py based build system essentially allowed arbitrary code execution during the build process. At one point it was at least strongly suggested that the version produced by that build step better match what is uploaded to PyPI. However, in practice, if you lie about the version that is okay too. You could upload a source distribution to PyPI that claims it's 2.0 but will in fact install 2.0+somethinghere or a completely different version entirely.
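
As a hedged illustration of that point (the package name, dependency, and version logic below are invented for this sketch, not taken from any real project), a setup.py is ordinary Python code, so nothing stops it from computing metadata at build or install time:

import time
from setuptools import setup

# Hypothetical setup.py: the version is computed when the build runs, so the
# resulting metadata can differ between machines and between runs.
setup(
    name="example-package",
    version="2.0" if time.localtime().tm_hour < 12 else "2.0.post1",
    install_requires=["cool-dependency"],
)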

What happens is that both before a package is published to PyPI and when a package is installed locally after downloading, the metadata is generated from scratch. Not only does that mean the metadata does not have to match, it also means that it's allowed to be completely different. It's absolutely okay for a package to claim it depends on cool-dependency on your machine, but on uncool-dependency on my machine. Or to depend on different packages depending on the time of day or the phase of the moon.

Editable installs and caching are particularly problematic since metadata could become invalid almost immediately after being written. [3]

Some of this has been somewhat improved, because the new pyproject.toml standard encourages static metadata. However, build systems are entirely allowed to override that by falling back to what is called “dynamic metadata”, and this is something that is commonly done.
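
As a small sketch of what that means for any tool that wants to read the metadata (assuming Python 3.11+ for the standard library tomllib module): a field listed under "dynamic" cannot be taken from pyproject.toml at all, and the build backend has to be invoked to learn its real value.

    # Minimal sketch: check whether the statically declared metadata can be
    # trusted, or whether a field is declared "dynamic" and requires a build.
    import tomllib

    with open("pyproject.toml", "rb") as f:
        project = tomllib.load(f).get("project", {})

    for field in ("version", "dependencies"):
        if field in project.get("dynamic", []):
            print(f"{field}: dynamic, only the build backend knows the real value")
        else:
            print(f"{field}: {project.get(field)!r}")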

In practice this system imposes a tremendous tax on everybody, one that is easy to miss.

  • Disjointed and complex metadata access: there is no clear relationship between a PyPI package name and the installed Python modules. If you know the PyPI package name, you can access metadata via importlib.metadata. Metadata is not read from pyproject.toml, even when it is static; instead, the package name is used to look up the metadata in the .dist-info folder (specifically the METADATA file therein) installed into site-packages (a sketch using importlib.metadata follows after this list).

  • Mandatory metadata re-generation: As a consequence, if you edit pyproject.toml to change a piece of metadata, you need to re-install the package for that metadata to be updated in the .dist-info. People commonly forget to do that, so desynchronized metadata is very common. This is true even for static metadata today!

  • Unclear cache invalidation: Because metadata can be dynamic, it's not clear when you should automatically re-install a package. It's not enough to just track pyproject.toml for changes when dynamic metadata is used. uv, for instance, has a really complex, explicit cache management system so that one can help it detect outdated metadata. This is obviously non-standardized, requires uv to understand version control systems, and is also not shared with other tools. For instance, if you know that the version information incorporates the git hash, you can tell uv to pay attention to git commits.

  • Fragmented metadata storage: even where generated metadata is stored is complex. Different systems have slightly different behavior for storing that metadata.

    • When working locally (e.g. editable installs) what happens depends on the build system:
      • If setuptools is used, metadata is written into two locations: the legacy <PACKAGE_NAME>.egg-info/PKG-INFO file, and additionally the new location for metadata inside site-packages, in a <PACKAGE_NAME>.dist-info/METADATA file.
      • If hatch and most other modern build systems are used, metadata is only written into site-packages. (into <PACKAGE_NAME>.dist-info/METADATA)
      • If no build system is configured, it depends a bit on the installer. pip will build a wheel with setuptools even for an editable install; uv will only build a wheel and make the metadata available if one runs uv build. Otherwise the metadata is not available (in theory it could be found in pyproject.toml, for as long as it's not dynamic).
    • For source distributions (sdist) the build step first happens as in the section before. Afterwards the metadata is placed in a PKG-INFO file. It's currently placed in two locations in the sdist: PKG-INFO in the root and <PACKAGE_NAME>.egg-info/PKG-INFO. That metadata, however, I believe is only used for PyPI; when installing the sdist locally, the metadata is regenerated from pyproject.toml (or, if setuptools is used, setup.py). That's also why metadata can change between what's in the sdist and what's there after installation.
    • For wheels the metadata is placed in <PACKAGE_NAME>.dist-info/METADATA exclusively. Wheels have static metadata, so no build step is taking place. What is in the wheel is always used.
  • Dynamic metadata makes resolvers slow: Dynamic metadata makes the job of resolvers and installers very hard and slows them down. Today, for instance, advanced resolvers like poetry or uv are sometimes unable to install the right packages, because they assume that dependency metadata is consistent across sdists and wheels. However, there are a lot of sdists available on PyPI that publish incomplete dependency metadata (whatever the sdist build step produced on the developer's machine is what ends up cached on PyPI).

    Not getting this right can be the difference between hitting one static URL with all the metadata, and downloading a zip file, creating a virtualenv, installing build dependencies, generating an entire sdist and then reading the final generated metadata. That is many orders of magnitude of difference in execution time (a sketch of reading metadata straight out of a wheel follows after this list).

    This also extends to caching. If the metadata can constantly change, how would a resolver cache it? Is it required to build all possible source distributions to determine the metadata as part of resolving?

  • Cognitive complexity: The system introduces an enormous cognitive overhead which makes it very hard for users to understand, particularly when things go wrong. Incorrectly cached metadata can be almost impossible for a user to debug, because they do not understand what is going on. Their pyproject.toml shows the right information, yet for some reason it behaves incorrectly. Most people don't know what "egg info" or "dist info" is, or why an sdist has metadata in a different location than a wheel or a local checkout.

    Having support for dynamic metadata also means that developers continue to maintain elaborate and confusing systems. For instance there is a plugin for hatch that dynamically creates a readme [4], requiring arbitrary Python code to run just to display documentation. There are plugins to automatically change versions to incorporate git hashes. As a result, to figure out what version you actually have installed, it's not enough to look at a single file; you might have to rely on a tool to tell you what's going on.
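
As a concrete illustration of the access path described in the first bullet above, here is a minimal sketch using only the standard library; "requests" is merely a stand-in for whatever distribution happens to be installed. Note that everything is read from the installed .dist-info, not from pyproject.toml.

    # Minimal sketch: installed metadata is looked up by distribution name and
    # parsed from site-packages/<name>.dist-info/METADATA.
    from importlib.metadata import PackageNotFoundError, distribution, requires

    try:
        dist = distribution("requests")   # the PyPI/distribution name, not the import name
    except PackageNotFoundError:
        raise SystemExit("requests is not installed in this environment")

    print(dist.metadata["Name"], dist.version)   # read from .dist-info/METADATA
    print(requires("requests") or [])            # Requires-Dist entries, if any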
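
And as a rough sketch of why static wheel metadata is so much cheaper for a resolver than dynamic sdist metadata: in a wheel, the dependency information is just a file inside the archive and can be read without executing any build step (the wheel filename below is a hypothetical example).

    # Minimal sketch: read Requires-Dist straight out of a wheel's
    # .dist-info/METADATA file, no build step involved.
    import zipfile
    from email.parser import Parser

    def wheel_requires(wheel_path: str) -> list[str]:
        with zipfile.ZipFile(wheel_path) as zf:
            meta_name = next(n for n in zf.namelist()
                             if n.endswith(".dist-info/METADATA"))
            msg = Parser().parsestr(zf.read(meta_name).decode("utf-8"))
        return msg.get_all("Requires-Dist") or []

    print(wheel_requires("example_package-2.0-py3-none-any.whl"))

    # For an sdist, by contrast, an installer has to run the PEP 517 build hooks
    # before equivalent metadata even exists.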

Moving The Cheese

The challenge with dynamic metadata in Python is vast, but unless you are writing a resolver or packaging tool, you're not going to experience the pain as much. You might in fact quite enjoy the power of dynamic metadata. Unsurprisingly, bringing up the idea of removing it is received very badly; there are so many workflows that seemingly rely on it.

At this point, fixing this problem might be really hard because it's a social problem more than a technical one. Had the constraints been in place from the start, these weird use cases would never have emerged. But because the constraints were not there, people were free to go to town leveraging the flexibility, with all the consequences that causes.

I think at this point it's worth moving the cheese, but it's unclear whether this can be done through a standard. Maybe the solution will be for tools like uv or poetry to warn when dynamic metadata is used and strongly discourage it. Then, over time, the users of packages that use dynamic metadata will start to urge the package authors to stop using it.

The cost of dynamic metadata is real, but it's felt only in small ways, all the time. You notice it a bit when your resolver is slower than it has to be, you notice it when your packaging tool installs the wrong dependency, you notice it when you have to read the manual for the first time because you need to reconfigure your cache key or force a package to constantly reinstall, you notice it when you have to re-install your local dependencies over and over so they don't break. There are many ways you notice it. You don't notice it as a roadblock, just as a tiny, tiny tax. Except it is a tax we all pay, and it makes the user experience significantly worse compared to what it could be.

The deeper lesson here is that if you give developers too much flexibility, they will inevitably push the boundaries and that can have significant downsides as we can see. Because Python's packaging ecosystem lacked constraints from the start, imposing them now has become a daunting challenge. Meanwhile, other ecosystems, like JavaScript's, took a more structured approach early on, avoiding many of these pitfalls entirely.

[1] You can see how this works in action for sentry-cli, for instance. The @sentry/cli package declares all its platform-specific dependencies as optionalDependencies (relevant package.json). Each platform build has a filter in its package.json for os and cpu. For instance, this is what the arm64 linux binary dependency looks like: package.json. npm will attempt to install all optional dependencies, but it will skip over the ones that are not compatible with the current platform.

[2] For @sentry/cli at version 2.39.0, for instance, this means that this singular URL will return all the information that a resolver needs: registry.npmjs.org/@sentry/cli/2.39.0

[3] A common error in the past was to receive a pkg_resources.DistributionNotFound exception when trying to run a script in local development.

[4] I got some flak on Bluesky for throwing readme generators under the bus. While they do not present the same problems that metadata like dependencies and versions do, they still increase the complexity. In an ideal world, what you find in site-packages represents what you have in your version control, and there is a README.md file right there. That's what you have in JavaScript, Rust and plenty of other ecosystems. What we have, however, is a build step (either dynamic or copying) taking that readme file and placing it in an RFC 5322 header encoded file in a dist-info. So instead of "command clicking" on a dependency and finding the readme, we need special tools or arcane knowledge if we want to read the readme files locally.
Categories: FLOSS Project Planets

Sandro Knauß: Akademy 2024 in Würzburg

Planet Debian - Mon, 2024-11-25 19:00

To prepare for Akademy I started some days beforehand to give my Librem 5 (an open hardware phone) another try, and ended up with a Plasma 6 that would not start. This issue was actually already known, but hadn't been addressed. In the end I arrived at Akademy with phosh (which is GNOME based) installed on my Librem 5, in order to have something working.

I met Bushan and Bart, who took care of it, and the issue was fixed two days later, so I could finally install Plasma 6 on it. The last time I tested my Librem 5, with Plasma 5, it felt sluggish and didn't work well. But this time I was impressed by how well the system reacts. Sure, there are some rough edges here and there, but in the bigger picture it is quite usable. One annoying issue is that the camera only works with one app, and the other is the battery capacity: you have to charge it once a day. Because there is no QR reader that can use the camera, getting data onto the phone was quite challenging. Unfortunately the conference Wi-Fi isolated the devices from each other, so I couldn't use KDE Connect to transfer data. In the end the only way to import data was to take five photos of the QR code to import my D-Ticket into Itinerary.

Having a device running Plasma Mobile, it was immediately used for an experiment: how well does Dolphin work on a Plasma Mobile device? Together with Felix Ernst I tried it out, and we were quite impressed that Dolphin works very well on Plasma Mobile after some simple modifications to the UI. That resulted in a patch to add a mobile UI for Dolphin: !826.

With more time to play with my Librem 5 I also found a bug in KWeather: it is missing a Refresh option when used in a Plasma Mobile environment (#493656).

Akademy is a good place to identify and solve issues. It is always like that: you chat with someone, they can tell you who to ask about your concrete question, and in the end you can solve things that seemed unsolvable at the beginning.

There was also time to look into the travelling app Itinerary. People are faced with a lot of real-world issues when they are not in their home town, and Itinerary is the best travelling app I know of. It can import nearly every ticket you have, can get location information from restaurant websites, and allows routing to that place. It adds a lot of useful information while travelling, like current delays, platform changes, live elevator updates, weather information at the destination and a station map, all with a strong focus on privacy.

In detail I found some small things to improve:

  • If you search for a bus ride and enter the correct name of the bus stop, it will still add a short walk from and to the station. The issue here is that different backends are used, and not all backends share the same geo coordinates. That's why Itinerary needs some heuristics to remove those paths.

  • Instead of displaying just a small station map of one bus stop in the inner city, it showed the complete Würzburg inner city, as there is one big park around the inner city (named "Ringpark").

  • Würzburg has quite a big bus station, but the platform information was missing from the map, so we tweaked the CSS to display the platforms. To be sure that we didn't fix only Würzburg, we also checked whether Greifswald and Aix-en-Provence follow the same naming scheme.

I additionally learned that it has a lot of details that help people with special needs. That is the reason why Daniel Kraut wants to get Itinerary available for iOS. As soon as Daniel spoke out that he wants to reach this goal, others started implementing the first steps towards building apps for iOS.

This year I volunteered to help out at Akademy. For me it was a lot of fun to meet everyone at the info desk or to help the speakers set up the projector and microphone. It is also a good opportunity to meet many new faces and get in contact with them. I also see room for improvement. As we were quite busy at the Welcome Event getting the badges out to everyone, I couldn't answer the questions from newcomers, as the queue was too long. I propose that some people volunteer to be available for questions from newcomers. It is often hard for newcomers to make their first contact(s) in a new community, and there is a lot of room to make it easier for them to join. Some ideas in my head are: hold an event for the newcomers to give them some links into the community and show that everyone is friendly; arrange the tables at the BoFs in a circle, so everyone can see each other (it was also hard for me to understand everyone, as they mostly spoke towards the front). And BoFs are sometimes full of very specific terms, and if you are not already deep into the topic you are lost. I can see the problem: on the one side, BoFs are also the place where the people who already know the topic want to get things done; on the other side, newcomers join BoFs, are overwhelmed by too many new words, get frustrated and think that they are not welcome. Maybe at least everyone should introduce themselves by name and ask new faces why they joined the BoF, to help them join in.

I'm happy that the food provided for the attendees was very delicious, and that I'm not the only one who is mostly vegetarian, with quite a few being vegan.

At the conference the KDE Eco initiative really caught my attention, as I see a lot of new possibilities in giving people more reasons to switch to an open source system. Natalie's talk was great for seeing how pupils get excited about open source and also help their grandparents move to a Linux system. As I will also start working as a teacher, I really got ideas about what I can do at school. Together with Joseph and Nicole, we finally started to think about how to run an exploration of what kind of old hardware KDE software still runs on. The ones with the oldest hardware will get an old KDE shirt. For more information see #40.

The conference was very motivating for me; I even still had energy in the evenings to do some Debian packaging, finally pushed kweathercore to Debian, and started to work on KWeather. Now I'm even more interested in the KDE apps focused on the mobile world, as I now have hardware that can actually run those apps.

I really enjoyed the workshop on how to contribute to Qt by Volker Hilsheimer, especially the way Volker explained things in a very friendly manner, answered every question, and sometimes postponed questions but came back to them later. All in all I now have a good overview of how Qt development works and how I can fix bugs.

The day trip to Rothenburg ob der Tauber was very interesting for me. It was the first time I visited the town, but in my memory it feels like I already know it. I grew up reading a lot of comic albums, including the good sci-fi comic album series "Yoko Tsuno" created by the Belgian writer Roger Leloup. Yoko Tsuno is an electronics engineer, raised in Japan but now living in Belgium. In "On the Edge of Life" she helps her friend Ingrid, who lives in Rothenburg. Leloup invested a lot of time in travelling to make his drawings as accurate as possible.

In order not to have a hard cut from Akademy to normal life, I had lunch with Carlos to discuss KDE Neon and how we can improve the interaction with Debian. In the future this should have less friction and make both communities work together more smoothly. Additionally, as I used to develop on KDEPIM with the help of Docker images based on Neon, I asked for a kf6 dev meta package. That should help get rid of most hand-written lists of dev packages in the Dockerfile, in order to make it simpler for new contributors to start hacking on KDEPIM.

For the rest of the day I finally found time to do the normal tourist stuff: going to the wine bridge and taking a walk up to the castle of Würzburg. Unfortunately you hear a lot of car noise up there, but I could finally relax in a Japanese-designed garden.

Finally, on Saturday I started my trip back. The trains towards Eberswalde were disrupted and I needed to find an alternative route. I got a little bit nervous, as it was the first time I travelled with only my Librem 5 and Itinerary, and I needed to reach the next train in less than two minutes. With the indoor maps provided, I could prepare my run through the train station, so I successfully reached my next train.

By the way, even if you only use KDE software, I would recommend everyone to join Akademy ;)

Categories: FLOSS Project Planets


KDE Plasma 6.2.4, Bugfix Release for November

Planet KDE - Mon, 2024-11-25 19:00

Tuesday, 26 November 2024. Today KDE releases a bugfix update to KDE Plasma 6, versioned 6.2.4.

Plasma 6.2 was released in October 2024 with many feature refinements and new modules to complete the desktop experience.

This release adds three weeks' worth of new translations and fixes from KDE's contributors. The bugfixes are typically small but important and include:

  • libkscreen Doctor: clarify the meaning of max. brightness zero. Commit. Fixes bug #495557
  • Plasma Workspace: Battery Icon, Add headphone icon. Commit.
  • Plasma Audio Volume Control: Fix speaker test layout for Pro-Audio profile. Commit. Fixes bug #495752
View full changelog
Categories: FLOSS Project Planets

Ruqola 2.3.2

Planet KDE - Mon, 2024-11-25 19:00

Ruqola 2.3.2 is a feature and bugfix release of the Rocket.chat app.

It includes many fixes for RocketChat 7.0.

New features:

  • Fix administrator refresh user list
  • Fix menu when we select video conference message
  • Fix RocketChat 7.0 server support
  • Fix create video message
  • Fix update cache when we change video/attachment description
  • Fix export message job
  • Fix show userOffline when we have a group
  • Fix enable/disable ok button when search room in team dialog
  • Fix crash when we remove room in team dialog
  • Fix update channel selection when we reconnect server

URL: https://download.kde.org/stable/ruqola/
Source: ruqola-2.3.2.tar.xz
SHA256: 57c8ff6fdeb4aba286425a1bc915db98ff786544a3ada9dec39056ca4b587837
Signed by: E0A3EB202F8E57528E13E72FD7574483BB57B18D Jonathan Riddell jr@jriddell.org
https://jriddell.org/jriddell.pgp

Categories: FLOSS Project Planets

Talking Drupal: Talking Drupal #477 - Drupal Association CTO Then & Now

Planet Drupal - Mon, 2024-11-25 15:00

Today we are talking about being the CTO of the Drupal Association, how the job has changed, and how it has impacted Drupal, with guests Josh Mitchell & Tim Lehnen. We’ll also cover Automatic Anchors as our module of the week.

For show notes visit: https://www.talkingDrupal.com/477

Topics
  • How long ago were you CTO Josh
  • Tim when did you take over
  • DA infrastructure
  • Drupal Credit System
  • Josh's proudest moment
  • Tim's proudest moment
  • Growth
  • Josh if you could do one thing differently
  • Tim if you could make one change
  • Future of the CTO job
Resources

Guests

Tim Lehnen - aspenthornpress.com hestenet

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan John Picozzi - epam.com johnpicozzi Joshua "Josh" Mitchell - joshuami.com joshuami

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted headings on your Drupal site to have unique id values, so links can be created to take users to specific parts of any page? There’s a module for that.
  • Module name/project name: Automatic Anchors
  • Brief history
    • How old: created in Jun 2020 by Chris Komlenic (komlenic) of Penn State
    • Versions available: 2.1.1-beta1, which supports Drupal 8.8, 9, and 10
  • Maintainership
    • Test coverage
    • Number of open issues: x open issues, y of which are bugs against the current branch
  • Usage stats:
    • 137 sites
  • Module features and usage
    • By default, the module automatically generates ids on h2, h3, h4, h5, and h6 elements within the page content
    • Even if two headings have the same content, the module will make sure their ids are unique, as well as making sure they are i18n-friendly, use hyphens instead of spaces, and are short enough to be useful
    • The module won’t interfere with or change manually-added or already-existing HTML ids
    • There’s a permission to view helpful links on each heading that make the ids obvious and easy to copy
    • Configuration options include the root element it should look within (defaults to the body tag), which elements should get ids, what content to use for the displayed links, and whether or not to generate ids on admin pages
Categories: FLOSS Project Planets
