Feeds

Python Bytes: #409 We've moved to Hetzner write-up

Planet Python - Thu, 2024-11-14 03:00
Topics covered in this episode:

  • terminal-tree
  • posting: The API client that lives in your terminal
  • Extra, extra, extra
  • UV does everything or enough that I'm not sure what else it needs to do
  • Extras
  • Joke

Watch on YouTube

About the show

Sponsored by:

  • ScoutAPM - Django Application Performance Monitoring
  • Codeium - Free AI Code Completion & Chat

Connect with the hosts

  • Michael: @mkennedy@fosstodon.org
  • Brian: @brianokken@fosstodon.org
  • Show: @pythonbytes@fosstodon.org

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list. We'll never share it.

Michael #1: terminal-tree

  • An experimental filesystem navigator for the terminal, built with Textual
  • Tested on macOS only at this point. Chances are very high it works on Linux. Slightly lower chance (but non-zero) that it works on Windows.
    • Can confirm it works on Linux

Brian #2: posting: The API client that lives in your terminal

  • Also uses Textual
  • From Darren Burns
  • Interesting that the installation instructions recommend using uv:
    • uv tool install --python 3.12 posting
  • Very cool. Great docs. Beautiful. Keyboard-centric, but also usable with a mouse.
  • “Fly through your API workflow with an approachable yet powerful keyboard-centric interface. Run it locally or over SSH on remote machines and containers. Save your requests in a readable and version-control friendly format.”
  • Able to save multiple environments
  • Great colors
  • Allows scripting to run Python code before and after requests to prepare headers, set variables, etc.

Michael #3: Extra, extra, extra

  • spaCy course swag give-away, enter for free
  • New essay: Opposite of Cloud Native is?
  • News: We've moved to Hetzner
  • New package: Introducing chameleon-flask package
  • New release: Listmonk Python client
  • TIOBE Update
  • PEP 750 – Template Strings
  • Canary email
  • Left Omnivore for Pocket, left Pocket for …, landed on Instapaper
    • Supports direct import from Omnivore and Pocket
    • Though Hoarder is compelling
  • Trying out Zen Browser
    • Wasn’t a fan of Arc (especially now) but the news turned me on to Zen

Brian #4: UV does everything or enough that I'm not sure what else it needs to do

  • Jeff Triplett
  • “UV feels like one of those old infomercials where it solves everything, which is where we have landed in the Python world.”
  • “My favorite feature is that UV can now bootstrap a project to run on a machine that does not previously have Python installed, along with installing any packages your application might require.”
  • Partial list (see Jeff’s post for his complete list):
    • uv pip install replaces pip install
    • uv venv replaces python -m venv
    • uv run, uv tool run, and uv tool install replace pipx
    • uv build - Build your Python package for PyPI
    • uv publish - Upload your Python package to PyPI, replacing twine and flit publish

Extras

Brian:

  • Coverage.py originally was just one file
  • Trying out BlueSky: brianokken.bsky.social
    • Not because of Taylor Swift, but nice.
    • There are a lot of Python people there.

Joke: How programmers sleep
Categories: FLOSS Project Planets

Stefan Scherfke: Publishing to PyPI with a Trusted Publisher from GitLab CI/CD

Planet Python - Thu, 2024-11-14 01:18

PyPA’s Trusted Publishers let you upload Python packages directly from your CI pipeline to PyPI, without any long-lived secrets like API tokens. This makes uploading Python packages not only easier than ever but also more secure.

In this article, we’ll look at what Trusted Publishers are and how they’re more secure than using API tokens or a username/password combo. We’ll also learn how to set up our GitLab CI/CD pipeline to:

  • continuously test the release process with the TestPyPI on every push to main,
  • automatically perform PyPI releases on every Git tag, and
  • additionally secure the process with GitLab (deployment) environments.

The official documentation explains most of this, but it doesn’t go into much depth regarding GitLab pipelines and leaves a few details unexplained.

Why should I want to use this?

API tokens aren’t inherently insecure, but they do have a few drawbacks:

  • If they are passed as environment variables, there’s a chance they’ll leak (think of a debug env | sort command in your pipeline).
  • If you don’t watch out, bad co-maintainers can steal the token and do mischief with it.
  • You have to manually renew the token from time to time, which can be annoying in the long run.

Trusted Publishers can avoid these problems or, at the very least, reduce their risk:

  • You don’t have to manually renew any long-lived tokens.
  • All tokens are short-lived. Even if they leak, they can’t be misused for long.

After we’ve learned how Trusted Publishers and protected GitLab environments work, we will take another look at security considerations.

How do Trusted Publishers work?

The basic idea of Trusted Publishers is quite simple:

  • In PyPI’s project settings, you add a Trusted Publisher and configure it with the GitLab URL of your project.
  • PyPI will then only accept package uploads if the uploader can prove that the upload comes from a CI pipeline of that project.

The technical process behind this is based on the OpenID Connect (OIDC) standard.

Essentially, the process works like this:

  • In your CI pipeline, you request an ID token for PyPI.
  • GitLab injects the short-lived token into your pipeline as a (masked) environment variable. It is cryptographically signed by GitLab and contains, among other things, your project’s path with namespace.
  • You use this token to authenticate with PyPI and request another token for the actual package upload.
  • This API token can now be used just like “normal” project-scoped API tokens.

The Trusted Publishers documentation explains this in more detail.
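To make the token exchange concrete, here is a minimal Python sketch of these steps. It assumes it runs in a pipeline job where GitLab has injected the ID token as the PYPI_ID_TOKEN environment variable (the same names appear in the pipeline configuration later in this article):

import json
import os
import urllib.request

# The short-lived, signed ID token that GitLab injected into the job.
id_token = os.environ["PYPI_ID_TOKEN"]

# Exchange the ID token for an API token at PyPI's mint-token endpoint.
request = urllib.request.Request(
    "https://pypi.org/_/oidc/mint-token",
    data=json.dumps({"token": id_token}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(request) as response:
    publish_token = json.load(response)["token"]

# "publish_token" can now be used like a normal project-scoped API token.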

One problem remains, though: An ID token can be requested in any pipeline job and in any branch. Malicious contributors could sneak in a pipeline job and make a corrupted release.

This is where environments come in.

Environments

GitLab environments represent your deployed code in your infrastructure. Think of your code running in a container in your production or testing Kubernetes cluster; or your Python package living on PyPI. :-)

The most important feature of environments in this context is access control: You can protect environments, restricting deployments to them. For protected environments, you can define users or roles that are allowed to perform deployments and that must approve deployments. For example, you could restrict deployments (uploads to PyPI) to all maintainers of your project, but only after you yourself have approved each release.

Note

Protected environments are a premium feature.

Non-profit open source projects/organizations can apply for a free ultimate subscription.

It seems that very old projects also have this feature enabled. Otherwise I can’t explain why I have it for Typed Settings but not for my other projects…

To use an environment in your CI/CD pipeline, you need to add it to a job in the .gitlab-ci.yml.

If we also store the name of the environment in the PyPI deployment settings, only uploads from that environment will be allowed, i.e. only uploads that have been authorized by selected people.

Only maintainers can deploy to the release environment, and only after Stefan has approved it.

Security Considerations

The last two sections have already hinted at this: GitLab environments are only truly secure if you can protect them.

Let’s take a step back and consider what threats we’re trying to protect against, so that we’ll then be able to choose the right approach:

  1. Random people doing a merge request for your project.
  2. Contributors with the developer role committing directly into your project.
  3. Co-maintainers with more permissions than a developer.
  4. A Jia Tan whom you trust even more than the other maintainers.

What can we do about it?

  1. Code in other people’s forks doesn’t have access to your project’s CI variables, nor can it request OIDC ID tokens in your project’s name. But you need to carefully review each MR!
  2. Contributors with only developer permissions can still request ID tokens. If you cannot use protected environments, using an API token stored in a protected CI/CD variable is a more secure approach. You should also protect your main branch and all tags (using the * pattern), so that developers only have access to feature branches. You’ll find it under Settings → Repository → Protected branches/tags.
  3. Protected CI/CD variables do not protect you from malicious maintainers, though. Even if you only allow yourself to create tags, other maintainers still have access to protected variables. Using protected environments with only a selected set of approvers is the most secure approach.
  4. If a very trusted co-maintainer becomes malicious, there’s very little you can do. Carefully review all commits and read the audit logs (Secure → Audit Events).

So that means for you:

  • If you are the only maintainer of a small open source project, just use a Trusted Publisher with (unprotected) environments.
  • If you belong to a larger project with multiple maintainers, consider applying for GitLab for Open Source and use a Trusted Publisher with a protected environment.
  • If there are multiple contributors and you don’t have access to protected environments, use an API token stored in a protected CI/CD variable and try to grant only developer permissions to contributors.

See also

Please also read about the security model and considerations in the PyPA docs.

Putting it all together

Configuring your GitLab project to use a trusted publisher involves three main steps:

  1. Update your project’s publishing settings on PyPI and TestPyPI.
  2. Update the CI/CD settings for your GitLab project.
  3. Update your project’s pyproject.toml and .gitlab-ci.yml.
PyPI Settings

Tell PyPI to trust your GitLab CI pipelines.

  1. Log in to PyPI and go to your account’s Publishing settings. Here, you can manage and add trusted publishers for your project.
  2. Add a new trusted publisher for GitLab as shown in the screenshot below.

    Enter your project’s namespace (your GitLab username or the name of your organization), the project name, and the filename of your CI definition (usually .gitlab-ci.yml).

    Use release as the environment name!

  3. Repeat the same steps for the TestPyPI, but use release-test as the environment name.
Add a trusted publisher on PyPI.

GitLab CI/CD Settings

You need to create two environments and protect the one for production releases.

  1. Open your project in GitLab, then go to Operate → Environments and click Create an environment to create the production environment:

    • Title: release
    • Description: PyPI releases (or whatever you want)
    • External URL: https://pypi.org/project/{your-project}/ (the URL is displayed in a few places in GitLab and helps you to quickly navigate to your project on PyPI.)

    Click Save.

Add an environment for your deployments in GitLab.

  2. Click New environment (in the top right corner) to create the test environment:

    • Title: release-test
    • Description: TestPyPI releases (or whatever you want)
    • External URL: https://test.pypi.org/project/{your-project}/

    Click Save.

  3. If protected environments are available (see the note above), navigate to Settings → CI/CD and open the Protected environments section. Click the Protect an environment button.

    • Select environment: release
    • Allowed to deploy: Choose a role or user, e.g. Maintainers.
    • Approvers: Choose a role or user, e.g. yourself.
Restrict who can deploy into the release environment (and thus, upload to PyPI).

Changes in Project Files

In order to be able to upload each commit to the TestPyPI, we need a different version for each build. To achieve this, we can use hatch-vcs, setuptools_scm, or similar.

In the following example, we are going to use hatchling with hatch-vcs as the build backend and uv for everything else.

  1. We configure the build backend in our pyproject.toml as follows:

    [build-system]
    requires = ["hatchling", "hatch-vcs"]
    build-backend = "hatchling.build"

    [tool.hatch.version]
    source = "vcs"
    raw-options = { local_scheme = "no-local-version" }  # TestPyPI lacks support for this

    [project]
    dynamic = ["version"]

    Hint

    Versions with a local component (e.g., 24.6.0+g1a2b3c) cannot be uploaded to (Test)PyPI, so we must disable this feature.

  2. Now let’s open our project’s .gitlab-ci.yml, which we’ll edit during the next steps.

    Hint

    The snippets in the next steps only show fragments of the .gitlab-ci.yml. I’ll post the complete file at the end of the article.

  3. We need at least a build and a deploy stage:

    stages:
      - 'build'
      # - 'test'
      # - ...
      - 'deploy'
  4. Python build tools usually put their artifacts (binary wheels and source distributions) into dist/. This directory needs to be added to your pipeline artifacts, so that these files are available in later pipeline jobs:

    build:
      stage: 'build'
      script:
        - 'uv build --out-dir=dist'
      artifacts:
        paths:
          - 'dist/'
  5. For our use-case, we need two release jobs: One that uploads to the TestPyPI on each push (release-test) and one that uploads to the PyPI in tag pipelines (release).

    Since both jobs are nearly the same, we’ll also define an “abstract base job” .release-base which the other two extend.

    Hint

    To improve readability and avoid issues with escaping, we’ll use YAML multiline strings.

    The >- operator joins the following lines without a line break and strips additional whitespace.

    See yaml-multiline.info for details.

    .release-base:
      # Abstract base job for "release" jobs.
      # Extending jobs must define the following variables:
      # - PYPI_OIDC_AUD: Audience for the ID token that GitLab
      #   issues to the pipeline job
      # - PYPI_OIDC_URL: PyPI endpoint for retrieving a publish
      #   token with GitLab’s ID token
      # - UV_PUBLISH_URL: PyPI endpoint for the actual upload
      stage: 'deploy'
      id_tokens:
        PYPI_ID_TOKEN:
          aud: '$PYPI_OIDC_AUD'
      script:
        # Use the GitLab ID token to retrieve an API token from PyPI
        - >-
          resp="$(curl -X POST "${PYPI_OIDC_URL}"
          -d "{\"token\":\"${PYPI_ID_TOKEN}\"}")"
        # Parse the response and extract the token
        - >-
          publish_token="$(python -c
          "import json; print(json.loads('${resp}')['token'])")"
        # Upload the files from "dist/"
        - 'uv publish --token "$publish_token"'
        # Print the link to PyPI so we can quickly go there to verify the result:
        - 'version="$(uv run --with hatch-vcs hatchling version)"'
        - 'echo -e "\033[34;1mPackage on PyPI:\033[0m ${CI_ENVIRONMENT_URL}${version}/"'
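    If you want to verify what the folded scalar actually produces, here is a quick sanity check with PyYAML (an assumed dependency, purely illustrative; it is not part of the pipeline):

    import yaml  # PyYAML, assumed to be installed for this check

    # The ">-" folded scalar joins lines with spaces and strips the final newline:
    snippet = 'key: >-\n  line one\n  line two\n'
    print(yaml.safe_load(snippet))
    # {'key': 'line one line two'}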
  6. Now we can add the release-test job. It extends .release-base, defines variables for the base job, and rules for when the job should run:

    release-test:
      extends: '.release-base'
      rules:
        # Only run if it's a pipeline for the default branch or a tag:
        - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH || $CI_COMMIT_TAG'
      environment:
        name: 'release-test'
        url: 'https://test.pypi.org/project/typed-settings/'
      variables:
        PYPI_OIDC_AUD: 'testpypi'
        PYPI_OIDC_URL: 'https://test.pypi.org/_/oidc/mint-token'
        UV_PUBLISH_URL: 'https://test.pypi.org/legacy/'
  7. The release job looks very similar, but the variables have different values and the job only runs in tag pipelines.

    release:
      extends: '.release-base'
      rules:
        # Only run in tag pipelines:
        - if: '$CI_COMMIT_TAG'
      environment:
        name: 'release'
        url: 'https://pypi.org/project/typed-settings/'
      variables:
        PYPI_OIDC_AUD: 'pypi'
        PYPI_OIDC_URL: 'https://pypi.org/_/oidc/mint-token'
        UV_PUBLISH_URL: 'https://upload.pypi.org/legacy/'
The output of the release will look like this. There’s also a link that takes you directly to the release on PyPI.

That’s it. You should now be able to automatically create PyPI releases directly from your GitLab CI/CD pipeline. 🎉

A successful GitLab CI/CD pipeline for Typed Settings’ v24.6.0 release.

If you run into any problems, you can leave comments over at Mastodon or Bluesky.

And, as promised, here is the complete (but still minimal) .gitlab-ci.yml from the snippets above. If you want to see a real-world example, you can take a look at the Typed Settings pipeline definition.

# .gitlab-ci.yml
stages:
  - 'build'
  # - 'test'
  # - ...
  - 'deploy'

build:
  stage: 'build'
  script:
    - 'uv build --out-dir=dist'
  artifacts:
    paths:
      - 'dist/'

.release-base:
  # Abstract base job for "release" jobs.
  # Extending jobs must define the following variables:
  # - PYPI_OIDC_AUD: Audience for the ID token that GitLab issues to the pipeline job
  # - PYPI_OIDC_URL: PyPI endpoint for retrieving a publish token with GitLab’s ID token
  # - UV_PUBLISH_URL: PyPI endpoint for the actual upload
  stage: 'deploy'
  id_tokens:
    PYPI_ID_TOKEN:
      aud: '$PYPI_OIDC_AUD'
  script:
    - >-
      resp="$(curl -X POST "${PYPI_OIDC_URL}"
      -d "{\"token\":\"${PYPI_ID_TOKEN}\"}")"
    - >-
      publish_token="$(python -c
      "import json; print(json.loads('${resp}')['token'])")"
    - 'uv publish --token "$publish_token"'
    - 'version="$(uv run --with hatch-vcs hatchling version)"'
    - 'echo -e "\033[34;1mPackage on PyPI:\033[0m ${CI_ENVIRONMENT_URL}${version}/"'

release-test:
  extends: '.release-base'
  rules:
    - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH || $CI_COMMIT_TAG'
  environment:
    name: 'release-test'
    url: 'https://test.pypi.org/project/typed-settings/'
  variables:
    PYPI_OIDC_AUD: 'testpypi'
    PYPI_OIDC_URL: 'https://test.pypi.org/_/oidc/mint-token'
    UV_PUBLISH_URL: 'https://test.pypi.org/legacy/'

release:
  extends: '.release-base'
  rules:
    - if: '$CI_COMMIT_TAG'
  environment:
    name: 'release'
    url: 'https://pypi.org/project/typed-settings/'
  variables:
    PYPI_OIDC_AUD: 'pypi'
    PYPI_OIDC_URL: 'https://pypi.org/_/oidc/mint-token'
    UV_PUBLISH_URL: 'https://upload.pypi.org/legacy/'
Categories: FLOSS Project Planets

Gizra.com: Drupal on Azure - Forging Docker Image and Beyond

Planet Drupal - Wed, 2024-11-13 19:00
When you switch from PaaS to IaaS, suddenly you have a series of new responsibilities. There is a lot to learn and also a lot to mimic from existing PaaS providers. We would like to share our experience after successfully migrating four larger Drupal sites from Pantheon to Azure cloud. The overview is in chronological order, detailing how we proceeded with the implementation. Infrastructure Basics We worked with an excellent infrastructure team who provisioned the following architecture for us:
Categories: FLOSS Project Planets

Seth Michael Larson: Early promising results with SBOMs and Python packages

Planet Python - Wed, 2024-11-13 19:00

Published 2024-11-14 by Seth Larson

This critical role would not be possible without funding from the Alpha-Omega project. Massive thank-you to Alpha-Omega for investing in the security of the Python ecosystem!

I've kicked off a project to reduce the "phantom dependency" problem for Python. The phantom dependency problem is where distinct software (sometimes written in Python, but often C, C++, Rust, etc) is included in a Python package but then isn't recorded anywhere in the package metadata.

These distinct pieces of software aren't recorded not because of a lack of time or awareness; rather, there is no standardized method to record this information in Python package metadata.

This means that when a software composition analysis (SCA) tool looks at the Python package, the tool will "miss" all the software that's included in the package aside from the top-level package itself.

For example, the popular Python image manipulation library "Pillow" is not only "Pillow": the wheel files contain many more libraries to comply with the "manylinux" package platform:

# (the below libraries are bundled by auditwheel)
$ unzip -l pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl | grep 'pillow.libs'
pillow.libs/
pillow.libs/libharfbuzz-144af51e.so.0
pillow.libs/libxcb-b8a56d01.so.1.1.0
pillow.libs/libpng16-4cc6a9fc.so.16.44.0
pillow.libs/libXau-154567c4.so.6.0.0
pillow.libs/libbrotlicommon-3ecfe81c.so.1
pillow.libs/liblzma-c9407571.so.5.6.3
pillow.libs/libfreetype-e7d5437d.so.6.20.1
pillow.libs/liblcms2-e69eef39.so.2.0.16
pillow.libs/libopenjp2-05423b53.so
pillow.libs/libtiff-0a86184d.so.6.0.2
pillow.libs/libjpeg-45e70d75.so.62.4.0
pillow.libs/libbrotlidec-ba690955.so.1
pillow.libs/libwebp-2fd3cdca.so.7.1.9
pillow.libs/libsharpyuv-898c0cb5.so.0.1.0
pillow.libs/libwebpdemux-f2642bcc.so.2.0.15
pillow.libs/libwebpmux-d524b4d5.so.3.1.0

I see many recognizable projects in the list of shared objects, like libjpeg, libwebp, libpng, xz-utils (liblzma), etc. If we try to scan this installed wheel with a tool like Syft, we receive this report:

$ syft dir:venv
 ✔ Indexed file system  venv
 ✔ Cataloged packages   [2 packages]

NAME    VERSION  TYPE
pillow  11.0.0   python
pip     24.2     python

Syft isn't able to find any of the compiled libraries! So if we were to run a vulnerability scanner we would only receive vulnerability records for Pillow and pip. My plan is to help fix this problem with Software Bill-of-Materials documents (SBOMs) included in a standardized way inside of Python packages.

To test how well this proposal works with today's tools, I forked auditwheel and created a rudimentary patch which:

  • For each shared library which is being bundled into a wheel, record the original file path and checksum. Bundle the shared libraries into the wheel as normal.
  • Using the platform-specific package manager, query each file path back to the package that provides the file. In this specific case rpm was used (rpm -qf <path>) because manylinux_2_28_x86_64 uses AlmaLinux 8 as the distribution.
  • Gather information about that package using rpm, such as the name, version, etc.
  • For each package, create the intrinsic "package URL" (PURL) software identifier for later use (see the sketch after this list). This includes information about the packaging format, package name, version, but also the distro and architecture. For example, the PURL for the copy of libwebp used by the wheel is: pkg:rpm/almalinux/libwebp@1.0.0-9.el8_9.1?arch=x86_64&distro=almalinux-8
  • Generate a CycloneDX SBOM file containing the above gathered information split into components and with relationship links between the top-level component (Pillow) and the bundled libraries.
  • Embed that generated SBOM file into the wheel.
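As a rough sketch of the PURL construction in that fourth step, here is how the pieces fit together. The helper below is hypothetical, not actual auditwheel code; it just reproduces the libwebp example:

def rpm_purl(name: str, version: str, release: str, arch: str,
             distro: str = "almalinux", distro_version: str = "8") -> str:
    """Assemble a package URL for an RPM (illustrative helper)."""
    return (
        f"pkg:rpm/{distro}/{name}@{version}-{release}"
        f"?arch={arch}&distro={distro}-{distro_version}"
    )

print(rpm_purl("libwebp", "1.0.0", "9.el8_9.1", "x86_64"))
# pkg:rpm/almalinux/libwebp@1.0.0-9.el8_9.1?arch=x86_64&distro=almalinux-8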

Let's run through building Pillow from source and using our forked auditwheel:

# The manylinux image may differ depending on your platform.
$ docker run --rm -it -v.:/tmp/wheelhouse \
    quay.io/pypa/manylinux_2_28_x86_64

# Install dependencies for Pillow
$ yum install --nogpgcheck libtiff-devel \
    libjpeg-devel openjpeg2-devel zlib-devel \
    freetype-devel lcms2-devel libwebp-devel \
    tcl-devel tk-devel harfbuzz-devel \
    fribidi-devel libxcb-devel

# Create a virtualenv and install auditwheel fork
$ /usr/local/bin/python3.12 -m venv venv
$ source venv/bin/activate
$ python -m pip install build
$ python -m pip install git+https://github.com/sethmlarson/auditwheel@sboms

# Download the Pillow source from PyPI
$ python -m pip download --no-binary=pillow pillow==11.0.0
$ tar -xzvf pillow-11.0.0.tar.gz

# Build a non-manylinux wheel for Pillow
$ python -m build ./pillow-11.0.0/

# Repair the wheel using auditwheel
$ auditwheel repair ./pillow-11.0.0/dist/
...
Fixed-up wheel written to /wheelhouse/pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl

# Inspect the wheel for our SBOM, there it is!
$ unzip -l /wheelhouse/pillow-11.0.0-*.whl | grep '.cdx.json'
    5703  11-14-2024 19:39   pillow.libs/auditwheel.cdx.json

# Move the wheel outside our container
$ mv /wheelhouse/pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl /tmp/wheelhouse/

So now we have a wheel file that contains an SBOM partially describing its contents. Let's try installing that wheel and running Syft:

$ syft dir:venv
 ✔ Indexed file system  /tmp/venv-pillow
 ✔ Cataloged packages   [13 packages]

NAME           VERSION          TYPE
Pillow         11.0.0           python
bzip2-libs     1.0.6-26.el8     rpm
freetype       2.9.1-9.el8      rpm
jbigkit-libs   2.1-14.el8       rpm
lcms2          2.9-2.el8        rpm
libXau         1.0.9-3.el8      rpm
libjpeg-turbo  1.5.3-12.el8     rpm
libpng         1.6.34-5.el8     rpm
libtiff        4.0.9-33.el8_10  rpm
libwebp        1.0.0-9.el8_9.1  rpm
libxcb         1.13.1-1.el8     rpm
openjpeg2      2.4.0-5.el8      rpm
pip            24.2             python

Woo hoo! Now the proper libraries are showing up in Syft. That means we'll be able to get vulnerability information from all the contained software components. This isn't the end; there are many, many, MANY ways that software ends up in a Python package. This quick validation test only shows that, even with today's SBOM and SCA tools, embedding SBOM documents into wheels can be useful for downstream tools. Onwards to even more! 🚀
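If you want to peek at the embedded SBOM yourself, the standard library is enough. A small sketch, assuming the repaired wheel from the walkthrough above is in the current directory:

import json
import zipfile

# Path and SBOM filename match the auditwheel output shown earlier.
wheel = "pillow-11.0.0-cp312-cp312-manylinux_2_28_x86_64.whl"
with zipfile.ZipFile(wheel) as zf:
    sbom = json.loads(zf.read("pillow.libs/auditwheel.cdx.json"))

# Print the name and version of every recorded component.
for component in sbom.get("components", []):
    print(component.get("name"), component.get("version"))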

If you're interested in this project, follow the repository on GitHub and participate in the kick-off discussion on Python Discourse.

That's all for this post! 👋 If you're interested in more you can read the last report.

Have thoughts or questions? Let's chat over email or social:

sethmichaellarson@gmail.com
@sethmlarson@fosstodon.org

Want more articles like this one? Get notified of new posts by subscribing to the RSS feed or the email newsletter. I won't share your email or send spam, only whatever this is!

Want more content now? This blog's archive has ready-to-read articles. I also curate a list of cool URLs I find on the internet.

Find a typo? This blog is open source, pull requests are appreciated.

Thanks for reading! ♡ This work is licensed under CC BY-SA 4.0

Categories: FLOSS Project Planets

drunomics: What DrupalCamp Berlin taught me about "volunteering"

Planet Drupal - Wed, 2024-11-13 13:06
From hesitation to enthusiasm: when I decided to volunteer at DrupalCamp Berlin, I didn't know what to expect. In hindsight, it was an unforgettable experience that allowed me to immerse myself deeply in the community. In this post, I share behind-the-scenes insights and what it really means to help shape an event like this.
Categories: FLOSS Project Planets

Drupal Association blog: Governance in the Drupal Ecosystem

Planet Drupal - Wed, 2024-11-13 12:24

The Summary

To ensure Drupal’s stability and independence, the project is managed through a well-established, transparent governance system. Dries Buytaert, the Founder and Project Lead, helped design a model that distributes power and prevents any single person or entity — even himself — from making unilateral decisions that could alter the project unexpectedly. The independent Drupal Association oversees Drupal.org and other key infrastructure, free from commercial pressures. This approach ensures that Drupal.org is reliable and creates a fair playing field for all contributors, embodying true open-source leadership.

Just as the Drupal software has grown and changed significantly over its 23-year history, so has its governance. And, while there’s always room for improvement, it is safe to say that Drupal’s seasoned governance is what allows it to be one of the largest, independent open source projects in the world. 

The Detail

Dries Buytaert, as the founder and project lead, ultimately guides the direction of Drupal, and is responsible for shaping the project’s philosophy and core principles. 

While Dries started Drupal on his own, he has helped evolve the governance model over the years to be mature and resilient.  To help govern the project's technical aspects, Dries established the core committer team and other supporting groups. To oversee non-technical areas, he co-founded the Drupal Association. These initiatives were intentional efforts to scale and strengthen Drupal’s governance.

On the technical side, the governance model for Drupal core is very mature, as described in the Drupal Project Governance. Technical decision-making is distributed among the core committers and other maintainers, promoting a transparent, structured, and collaborative approach to managing Drupal core.     

Many other aspects of Drupal governance are managed by the Drupal Association, which is a U.S. 501(c)3 nonprofit organization formed in 2008 to support the Drupal project and the Drupal community.  I am currently the Chief Executive Officer of the Association.  Our mission is to drive innovation and adoption of Drupal as a high-impact digital public good, hand-in-hand with our open source community.  A fundamental obligation of the Drupal Association is to ensure that Drupal is available to anyone, anywhere in the world free of charge.  We primarily accomplish this task through Drupal.org.

The Drupal Association is a bona fide non-profit organization (not a pass-through), with assets of just over $3 million and an operating budget of over $4 million. We publish our finances annually (see: Find the reports in the Accountability section of D.org).  The Association is not controlled or funded by any single entity nor does it pass revenues onto another entity.  The Association’s revenue comes from hundreds of organizations and thousands of individuals.  No single financial contributor accounts for more than 10% of our revenue. This diverse support base prevents any one entity from having too much influence.

The Drupal Association employs a full-time team of 19 professionals located throughout the world.  These people include engineers, marketers, accountants, communication staff, and program administration team members.  I say all this to demonstrate that we have the capacity to legitimately, and independently, carry out our mission.

The Drupal Association owns and controls important components of the Drupal ecosystem that allow Drupal to be one of the largest independent FOSS projects in the world.

The Drupal Association owns and/or controls the infrastructure that powers Drupal.org.  The Drupal Association has complete control over who accesses Drupal.org, how they access it, and what they can do when accessing it.  These are covered by our Terms of Service.

In administering Drupal.org, the Drupal Association controls a number of services, including:

  • The database of Drupal.org users/project contributors
  • A self-hosted GitLab instance that includes all of the Drupal code repositories for core and contrib, testing with GitLab CI and documentation through GitLab Pages
  • Drupal software packaging (the actual .zip and .tar.gz files containing Drupal code)
  • Drupal Updates (the Updates.xml feed, Automatic Updates endpoint, Secure Signing server, and Packages.Drupal.org, the Composer endpoint for Drupal projects).
  • The Drupal namespace on GitHub
  • The Drupal namespace on Packagist
  • The Drupal namespace on NPM
  • The Drupal Infrastructure namespace on gitlab.com (separate from our self-hosted instance)
  • The contribution credit system
  • Usage data about Drupal core and extensions

The Drupal Association also owns and controls the primary means by which the community communicates and gathers.  We organize DrupalCons and manage Drupal Slack.  We issue The Drupal Association Newsletter and TheWeeklyDrop (together with Bob Kepford).  We control and manage Mastodon, X/Twitter, YouTube, LinkedIn (Drupal, Drupal Association, Drupal Jobs), Facebook, Instagram, and TikTok.

Drupal has the Maker/Taker Problem that nearly all open source projects face.  There are companies that profit off Drupal who don’t give back to help maintain the project.  The Drupal Association has chosen to address this issue by restructuring our Drupal Certified Partner program to focus exclusively on those companies that give back to the community.  The goal is to incentivize the creation of a culture of contribution within companies that work in Drupal that provide the Drupal Project with sufficient resources to innovate and grow.  There is always work to be done in creating a more equitable program, but it is beginning to work as we have more than doubled the number of Drupal Certified Partners in the past 15 months.

The Drupal Association is governed by a 12-person Board of Directors that meets several times a year, including two public meetings at DrupalCons.  Nine directors are selected by a Nominating Committee of the board and two directors are elected by members of the Drupal Association.  The final seat is the “Founding Director”.  This is a voting seat that can only be filled by Dries Buytaert.  Like all board seats, this is an unpaid, voluntary role that carries with it a single vote on the board.  It has to be approved annually by the Board of Directors. Except for the trademark licensing, the Drupal Association has no contracts or agreements with Dries Buytaert or the Drupal Project, and Dries receives no funding from the Drupal Association or its operation of Drupal.org.

Dries Buytaert owns the trademark “Drupal”.  He has transparently communicated the Drupal Trademark and Logo Policy by which these are governed.  Under the policy, any changes to the policy go into effect sixty (60) days after publication.  Dries Buytaert also owns the domain names “drupal.org”, “drupal.com” and “drupalcon.org”.

Dries has granted the Drupal Association an exclusive license to use “Drupal”, “Drupal.org”, and “DrupalCon” and a non-exclusive license to use Drupal for non-commercial uses.  This license allows the Drupal Association to support the Drupal Project by providing the infrastructure to host and maintain the official version of Drupal and to organize its contributors.  It also allows the Association to support the Drupal Community in their work with Drupal.

The net effect of this arrangement is that Dries Buytaert retains ultimate control over what software can be named “Drupal” and what website can be named “Drupal.org.”  He can thus ensure that any software that calls itself “Drupal” or website that uses “Drupal.org” conforms with his vision.  If he were ever to exercise this control against the community’s interests, it would likely cause the Drupal Association to fork the software and maintain it under a new name and URL.  The high cost of such an action to both parties makes this option highly unlikely and hard to execute quickly.

What the trademark does not allow him to do is to block any person or organization from using any component of Drupal core or any modules housed on Drupal.org.  Those decisions are at the sole discretion of the Drupal Association.  To date, we have exercised this authority in a very limited manner to protect and safeguard the website and its content from attacks and misuse.

Twenty-three years ago, Dries chose to release Drupal under an open-source license, inspiring tens of thousands to build careers and champion an Open Web. However, fulfilling this vision required more than just a General Public License. By creating the Drupal Association, setting up Drupal core's governance, and licensing the trademark, Dries ensured Drupal remained open-source without commercial entanglements, securing a strong, independent foundation.

Along with Dries Buytaert and many contributors, the Drupal Association is focused on the future of Drupal (see: Starshot Initiative). How can we support its adoption through marketing and create sustainable revenue streams for Drupal to flourish?  These are tough questions that confront many open source projects.  Our governance allows us to move forward in this work with great certainty.

Categories: FLOSS Project Planets

Bojan Mihelac: Building docker images with private python packages

Planet Python - Wed, 2024-11-13 11:37
Installing private Python packages into a Docker container can be tricky because the container does not have access to private repositories, and you do not want to leave a trace of a private SSH key in Docker…
Categories: FLOSS Project Planets

Bojan Mihelac: Django app name translation in admin

Planet Python - Wed, 2024-11-13 11:37
"Django app name translation in admin" is small drop-in django application that overrides few admin templates thus allowing app names in Django admin to be translated.
Categories: FLOSS Project Planets

Bojan Mihelac: Django-simpleadmindoc updated

Planet Python - Wed, 2024-11-13 11:37
Create documentation for a Django website.
Categories: FLOSS Project Planets

Bojan Mihelac: Rename uploaded files to ASCII charset in Django

Planet Python - Wed, 2024-11-13 11:37
Telling Django to rename all uploaded files to the ASCII charset is easy and takes only two steps.
Categories: FLOSS Project Planets

Real Python: Python Dictionary Comprehensions: How and When to Use Them

Planet Python - Wed, 2024-11-13 09:00

Dictionary comprehensions are a concise and quick way to create, transform, and filter dictionaries in Python. They can significantly enhance your code’s conciseness and readability compared to using regular for loops to process your dictionaries.

Understanding dictionary comprehensions is crucial for you as a Python developer because they’re a Pythonic tool for dictionary manipulation and can be a valuable addition to your programming toolkit.

In this tutorial, you’ll learn how to:

  • Create dictionaries using dictionary comprehensions
  • Transform existing dictionaries with comprehensions
  • Filter key-value pairs from dictionaries using conditionals
  • Decide when to use dictionary comprehensions

To get the most out of this tutorial, you should be familiar with basic Python concepts, such as for loops, iterables, and dictionaries, as well as list comprehensions.

Get Your Code: Click here to download the free sample code that you’ll use to learn about dictionary comprehensions in Python.

Creating and Transforming Dictionaries in Python

In Python programming, you’ll often need to create, populate, and transform dictionaries. To do this, you can use dictionary literals, the dict() constructor, and for loops. In the following sections, you’ll take a quick look at how to use these tools. You’ll also learn about dictionary comprehensions, which are a powerful way to manipulate dictionaries in Python.

Creating Dictionaries With Literals and dict()

To create new dictionaries, you can use literals. A dictionary literal is a series of key-value pairs enclosed in curly braces. The syntax of a dictionary literal is shown below:

{key_1: value_1, key_2: value_2, ..., key_N: value_N}

The keys must be hashable objects and are commonly strings. The values can be any Python object, including other dictionaries. Here’s a quick example of a dictionary:

>>> likes = {"color": "blue", "fruit": "apple", "pet": "dog"}
>>> likes
{'color': 'blue', 'fruit': 'apple', 'pet': 'dog'}

>>> likes["hobby"] = "guitar"
>>> likes
{'color': 'blue', 'fruit': 'apple', 'pet': 'dog', 'hobby': 'guitar'}

In this example, you create dictionary key-value pairs that describe things people often like. The keys and values of your dictionary are string objects. You can add new pairs to the dictionary using the dict[key] = value syntax.

Note: To learn more about dictionaries, check out the Dictionaries in Python tutorial.

You can also create new dictionaries using the dict() constructor:

>>> dict(apple=0.40, orange=0.35, banana=0.25)
{'apple': 0.4, 'orange': 0.35, 'banana': 0.25}

In this example, you create a new dictionary using dict() with keyword arguments. In this case, the keys are strings and the values are floating-point numbers. It’s important to note that the dict() constructor is only suitable for those cases where the dictionary keys can be strings that are valid Python identifiers.
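When the keys aren't valid identifiers, keyword arguments won't work, but dict() also accepts an iterable of key-value pairs:

>>> # dict(1="one") would be a syntax error, so pass pairs instead:
>>> dict([(1, "one"), (2, "two")])
{1: 'one', 2: 'two'}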

Using for Loops to Populate Dictionaries

Sometimes, you need to start with an empty dictionary and populate it with key-value pairs dynamically. To do this, you can use a for loop. For example, say that you want to create a dictionary in which keys are integer numbers and values are powers of 2.

Here’s how you can do this with a for loop:

>>> powers_of_two = {}
>>> for integer in range(1, 10):
...     powers_of_two[integer] = 2**integer
...
>>> powers_of_two
{1: 2, 2: 4, 3: 8, 4: 16, 5: 32, 6: 64, 7: 128, 8: 256, 9: 512}

In this example, you create an empty dictionary using an empty pair of curly braces. Then, you run a loop over a range of integer numbers from 1 to 9. Inside the loop, you populate the dictionary with the integer numbers as keys and powers of two as values.

The loop in this example is readable and clear. However, you can also use dictionary comprehension to create and populate a dictionary like the one shown above.
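As a quick preview, here's the comprehension equivalent of the loop above; the next section breaks down the syntax:

>>> {integer: 2**integer for integer in range(1, 10)}
{1: 2, 2: 4, 3: 8, 4: 16, 5: 32, 6: 64, 7: 128, 8: 256, 9: 512}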

Introducing Dictionary Comprehensions

Read the full article at https://realpython.com/python-dictionary-comprehension/ »


Categories: FLOSS Project Planets

Mike Driscoll: ANN – The textual-cogs Package – Creating Reusable Dialogs for Textual

Planet Python - Wed, 2024-11-13 08:30

Textual-cogs is a collection of Textual dialogs that you can use in your Textual application. You can see a quick demo of the dialogs below:

Dialogs included so far:

  • Generic MessageDialog – shows messages to the user
  • SaveFileDialog – gives the user a way to select a location to save a file
  • SingleChoiceDialog – gives the user a series of choices to pick from
  • TextEntryDialog – ask the user a question and get their answer using an Input widget
  • and more

You can check out textual-cogs on GitHub.

Installation

 

You can install textual-cogs using pip:

python -m pip install textual-cogs

You also need Textual to run these dialogs.

Example Usage

 

Here is an example of creating a small application that opens the MessageDialog immediately. You would normally open the dialog in response to a message or event that has occurred, such as when the application has an error or you need to tell the user something.

from textual.app import App

from textual_cogs import icons
from textual_cogs.dialogs import MessageDialog


class DialogApp(App):
    def on_mount(self) -> None:
        def my_callback(value: None | bool) -> None:
            self.exit()

        self.push_screen(
            MessageDialog(
                "What is your favorite language?",
                icon=icons.ICON_QUESTION,
                title="Warning",
            ),
            my_callback,
        )


if __name__ == "__main__":
    app = DialogApp()
    app.run()

When you run this code, you will get something like the following:

 

Creating a SaveFileDialog

 

The following code demonstrates how to create a SaveFileDialog:

from textual.app import App

from textual_cogs.dialogs import SaveFileDialog


class DialogApp(App):
    def on_mount(self) -> None:
        self.push_screen(SaveFileDialog())


if __name__ == "__main__":
    app = DialogApp()
    app.run()

When you run this code, you will see the following:

Wrapping Up

The textual-cogs package is currently only a collection of reusable dialogs for your Textual application. However, it can speed up building your TUI applications because the dialogs are already taken care of for you.

Check it out on GitHub or the Python Package Index today.

The post ANN – The textual-cogs Package – Creating Reusable Dialogs for Textual appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

qtatech.com blog: Managing Multilingual Content in Drupal 10 Multisites

Planet Drupal - Wed, 2024-11-13 07:51

In an increasingly globalized world, businesses are turning to multilingual solutions to reach an international audience. Drupal 10 offers a powerful multisite architecture that allows you to manage multiple sites from a single installation, ideal for organizations with a global reach.

Categories: FLOSS Project Planets

Droptica: 5 Problems You May Encounter When Integrating Drupal with Third-Party Software

Planet Drupal - Wed, 2024-11-13 07:22

Integrating Drupal with other systems is a common part of creating or developing a website or web application. Although Drupal offers many tools to facilitate this process, encountering minor or major difficulties is simply inevitable. Based on our knowledge from several hundred projects for clients, we’ve compiled a list of the common problems. It’s worth familiarizing yourself with them to effectively avoid them and speed up the implementation of integration projects.

Categories: FLOSS Project Planets

Real Python: Quiz: Basic Input and Output in Python

Planet Python - Wed, 2024-11-13 07:00

In this quiz, you’ll test your understanding of how to use Python’s built-in functions input() and print() for basic input and output operations.

You’ll also revisit how to use readline to improve the user experience when collecting input, and how to format output using the sep and end keyword arguments of print().
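As a quick refresher before you take the quiz, here's how sep and end change print()'s output:

>>> print("2024", "11", "13", sep="-", end="!\n")
2024-11-13!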


Categories: FLOSS Project Planets

Russell Coker: Modern Sleep

Planet Debian - Wed, 2024-11-13 05:10

Julius wrote an insightful blog post about the “modern sleep” issue with Windows [1]. Basically Microsoft decided that the right way to run laptops is to never entirely sleep, which uses more battery but gives better options for waking up and doing things. I agree with Microsoft in concept; this is a problem that can be solved. A phone can run for 24+ hours without ever fully sleeping; a laptop has a more power-hungry CPU and peripherals but also has a much larger battery, so it should be able to do the same. Some of the reviews for Snapdragon Windows laptops claim up to 22 hours of actual work without charging! So having suspend not really stop the system should be fine.

The ability of a phone to never fully sleep is a change in the quality of the usage experience: it means that you can access it and immediately have it respond, and it means that all manner of services can be checked for new updates which may require a notification to the user. The XMPP protocol (AKA Jabber) was invented in 1999, before laptops were common, and Instant Message systems were common long before then. But using Jabber or another IM system on a desktop was a very different experience to using it on a laptop, and using it on a phone is different again. The “modern sleep” allows laptops to act like phones in regard to such messaging services. Currently I have Matrix IM clients running on my Android phone and Linux laptop; if I get a notification that takes much typing for a response then I get out my laptop to respond. If I had an ARM based laptop that never fully shut down I would have much less need for Matrix on a phone.

Making “modern sleep” popular will lead to more development of OS software to work with it. For Linux this will hopefully mean that regular Linux distributions (as opposed to Android, which runs a Linux kernel but is very different from Debian etc.) get better support for such things and therefore become more usable on phones. Debian on a Librem 5 or PinePhonePro isn’t very usable due to battery life issues.

A laptop with an LTE card can be used for full mobile phone functionality. With “modern sleep” this is a viable option. I am tempted to make a laptop with LTE card and bluetooth headset a replacement for my phone. Some people will say “what if someone tries to call you when it’s not convenient to have your laptop with you”, my response is “what if people learn to not expect me to answer the phone at any time as they managed that in the 90s”. Seriously SMS or Matrix me if you want an instant response and if you want a long chat schedule it via SMS or Matrix.

Dell has some useful advice about how to use their laptops (and probably most laptops from recent times) in this regard [2]. You can’t close the lid before unplugging the power cable; you have to unplug first and then close. You shouldn’t put a laptop in a sealed bag for travel either. This is a terrible situation: you can put a tablet in a bag without taking any special precautions when unplugging, and laptops should work the same. The end result of what Microsoft, Dell, Intel, and others are doing will be good, but they are making some silly design choices along the way! I blame Intel mostly for selling laptop CPUs with TDPs >40W!

For an amusing take on this Linus Tech Tips has a video about being forced to use MacBooks by Microsoft’s implementation of Modern Sleep [3].

I’ll try out some ARM laptops in the near future and blog about how well they work on Debian.

Related posts:

  1. Modern Laptops Suck One of the reasons why I’m moving from a laptop...
  2. Cooling a Thinkpad Late last year I wrote about the way that modern...
  3. Cheap Laptops for Children I was recently browsing an electronics store and noticed some...
Categories: FLOSS Project Planets
