FLOSS Project Planets
Web Review, Week 2024-49
Let’s go for my web review for the week 2024-49.
Pourquoi les médias devraient créer des serveurs Mastodon maintenant (Why the media should create Mastodon servers now)
Tags: tech, social-media, fediverse, bluesky, politics, business
Article in French
Very good piece explaining why the Fediverse is currently our only option for a decentralized social media platform. Maybe Bluesky will become another option… maybe… but so far it’s only empty promises with a real risk of capture.
Tags: tech, ai, machine-learning, gpt, copyright, law
Another lawsuit making progress against OpenAI and their shady practice.
Tags: tech, security, ai, machine-learning, gpt
Let’s hope security teams don’t get saturated with low-quality security reports like this…
https://sethmlarson.dev/slop-security-reports
Tags: tech, ai, machine-learning, vision
Nice vision model. Looks like it strikes an interesting balance between performance and memory consumption. Looks doable to run cheaply and on premises.
https://huggingface.co/blog/smolvlm
Tags: tech, foss, hardware, networking
This is an excellent milestone reached for the OpenWrt project. Easily available hardware is a must. It’s rather cheap too.
https://sfconservancy.org/news/2024/nov/29/openwrt-one-wireless-router-now-ships-black-friday/
Tags: tech, unix, system
Good post about the very much overlooked fact that lots of commands buffer internally when their output is not a TTY.
https://jvns.ca/blog/2024/11/29/why-pipes-get-stuck-buffering/
Tags: tech, compiler, gpu, research
Interesting research about the feasibility of parallelizing compilers on the GPU. I wonder how far this will go.
https://dl.acm.org/doi/abs/10.1145/3528416.3530249
Tags: tech, c++, safety
Interesting piece, it highlights well the struggle for the C++ community to come up with a cohesive approach to improve safety. It doesn’t look like the solution is going to come from the standardization committee (unfortunately).
https://cor3ntin.github.io/posts/profiles/
Tags: tech, c++
Very nice improvements finally coming to structured bindings indeed. Should make them even more useful.
https://biowpn.github.io/bioweapon/2024/12/03/structured-bindings-cpp26.html
Tags: tech, language, benchmarking
Comparing languages based on some benchmark is probably a fool’s errand indeed. Too many factors can change between languages and benchmark implementations.
https://www.gingerbill.org/article/2024/01/22/comparing-language-benchmarks/
Tags: tech, web, frontend, react, criticism, product-management, performance
Excellent piece which shows why React (or Angular) is almost always a bad choice and that you’d be better off banking on the underlying web platform. It leads to better user experience, full stop. The article also goes to great lengths to debunk the claims which keep React dominant.
https://infrequently.org/2024/11/if-not-react-then-what/
Tags: tech, engineering, management
Nice example of an organizational setup to foster more autonomy and ownership in engineering teams. It clearly needs to be adapted to the project context, but it gives quite a few ideas. It strikes a nice balance by keeping both an individual and a team view of the responsibilities.
https://candost.blog/strong-ownership-culture-in-a-team/
Tags: tech, project-management, risk
Excellent article introducing how to analyse risks.
https://jacobian.org/2024/dec/4/risk-introduction/
Tags: tech, engineering, business, metrics
Good food for thought. It’s always a bit challenging to nicely explain the tie between engineering metrics and how they impact the business. This is a nice starting point.
https://icchasethi.medium.com/tying-engineering-metrics-to-business-metrics-f4df7651e026
Bye for now!
The Drop Times: Transforming Digital Luxury Experience: Aqua Expeditions’ Website Revamp Journey
PyCon: PyCon US 2025 Registration Launch!
The news you’ve been waiting for is finally here - registration for PyCon US 2025 is officially open!
PyCon US will take place May 14 - May 22, 2025, in Pittsburgh, Pennsylvania at the David L. Lawrence Convention Center. The core of the conference, May 16 - May 18, 2025, packs in three days worth of our community’s best talks, amazing keynote speakers, and our famed lightning talks to close out each day— but it is much more than that!
It’s gathering together with the members of our community, to learn from, share with, and connect. It’s joining a conversation in the hallway with the creators of open source projects. It’s taking yourself from beginner to intermediate; intermediate to advanced; or advanced to cutting edge. For some, it’s getting started with Python for the first time. We have loads of exciting plans in the works for this year, and we can’t wait to spend this special time with you!
How to Register
Once you have created an account on the PyCon US 2025 conference website, you can register via the registration button on your dashboard. Head over to our Registration Information page to get all the details on how to register.
Early Bird Registration Rates
PyCon US is providing discounted rates for Corporate, Student, and Individual tickets for the first 500 tickets sold during the first 30 days that registration is open. Don’t wait, register now to receive your discount!
- Early Bird Corporate - $750
- Early Bird Individual - $400
- Early Bird Student - $100
Regular rates will go into effect once early bird tickets sell out or after January 6, 2025.
Regular Registration Rates
- Corporate - $800 USD
- Individual - $450 USD
- Student - $125 USD
PyCon US is committed to protecting the health and safety of our community. To ensure that we are gathering safely, we have implemented updated guidelines and protocols to be followed by all attendees during the event. We ask that you please review these guidelines prior to registration.
To support a safe environment and enjoyable experience for all, PyCon US attendees are also required as always to comply with our Code of Conduct, which you can review here.
T-shirts & PyLadies Auction
Conference T-shirts and tickets to the PyLadies Auction are not yet available but will be released in the coming weeks. Keep an eye on the PyCon US 2025 website to be one of the first to know and grab yours while supplies last!
Tutorials
Tutorials will be presented on Wednesday, May 14, 2025, and Thursday, May 15, 2025. We are accepting proposals for tutorials through December 19, 2024. Find more information on how to submit your proposal via our website and our CfP platform. Once our program committee has scheduled the selected tutorials, you will be able to add them to your conference registration.
Watch for tutorial registration launch in February 2025. Opt-in for PyCon US News and follow us on Twitter, Mastodon, and the PSF LinkedIn for the announcement.
Sponsorship and Sponsor Presentations
Sponsorship for PyCon US 2025 is open now, and you can see the details of our sponsorship options and apply directly on our Sponsorship Application page. We’re grateful to all of our sponsors, who make PyCon US possible.
For those interested in a paid speaking opportunity, Sponsor Presentations will take place Thursday, May 15, 2025. To reserve a slot for an hour-long Sponsor Presentation on the topic of your choice, please apply for Partner Level Sponsorship or higher and select the check mark next to “Sponsor Presentation.” Slots are limited and typically sell out, so please submit your request soon. Contact sponsors@python.org with any questions.
Hotels
PyCon US has contracted special rates with nearby hotels. When you complete your registration for PyCon US 2025, you will be able to book a hotel reservation on your dashboard through our official housing bureau, Orchid Events. Booking through Orchid helps support PyCon US and it is the only way to get the conference rates, so book now while supplies last!
More information can be found on the Hotels page.
Note: Beware of Housing Pirates! PyCon US or Orchid Events will not be calling delegates to sell rooms. If you are contacted by an agency other than Orchid Events offering to make your hotel reservations, we urge you to not use their services. We cannot protect you against them if you do book a reservation.
Travel Grants - Applications now Open!
Check out the Travel Grant page to learn more about the support we provide for travel, hotel, and registration to ensure that everyone has an opportunity to attend PyCon US. We actively encourage people to apply for travel grants and welcome applications from any attendees who otherwise would not be able to attend. Our goal is to support diversity and provide equity to attendees and attract Python developers at all experience levels from around the world. For questions about the application process, visit the Travel Grant FAQ page.
Cancellation Fees
Registration cancellations must be submitted in writing to pycon-reg@python.org and received by May 1, 2025, in order to receive a refund minus the $50 cancellation fee ($25 for students; waived for cancellation due to health reasons). No refunds will be granted for cancellations received after May 1, 2025, unless you must cancel for any health-related reasons (see more details in the Health & Safety Guidelines). In lieu of cancellation, you have the option to transfer your registration to another person. For details about transferring your registration, visit the registration page.
Call for Proposals - Deadline December 19th!
There’s still time to submit your proposal to present a Talk, Charla, Poster, or Tutorial at PyCon US! More information on our website and on our CfP platform.
Community Booths - Applications now Open!
Each year, we set aside booth space in the Expo Hall for nonprofit organizations and community open source projects that serve the Python community and the broader open source ecosystem. If that describes your organization or group, we’d love for you to apply for one of our complimentary Community Booths. Visit this page for more details.
Real Python: The Real Python Podcast – Episode #231: Good Python Programming Practices When New to the Language
What advice would you give to someone moving from another language to Python? What good programming practices are inherent to the language? Christopher Trudeau is back on the show this week, bringing another batch of PyCoder's Weekly articles and projects.
Drupal Mountain Camp: EMPOWERING DIVERSITY: INCLUSION FUNDS FOR UNDERREPRESENTED VOICES
We are excited to introduce an initiative that not only reflects Drupal Mountain Camp's commitment to diversity, but also actively encourages the participation of underrepresented voices to promote inclusivity and diversity in the Drupal community. Open source communities were founded on the principle that diversity and varied perspectives strengthen the entire ecosystem. We are committed to providing opportunities for individuals from underrepresented groups to attend this year's event, share their experiences, and contribute to the thriving Drupal community.
What are Diversity Inclusion Funds?
Diversity and Inclusion funds offer complimentary tickets and financial support to help individuals participate in Drupal Mountain Camp.
How to Apply
Eligibility Criteria:
To qualify, applicants should be members of one of the following communities and be experiencing financial/economic hardship:
- Ability
- Age
- Ethnicity
- Gender
- Gender identity
- Race
- Religion
- Sexual orientation
- Socio-economic status/class
- Learning differences
- Family composition
Prospective applicants can request diversity tickets by filling out an application form found on the Drupal Mountain Camp website. The application requires basic information and a concise statement explaining the potential benefits of receiving the funds. The organizing committee will assess applications, considering eligibility criteria and the availability of funds. The application deadline for the Diversity and Inclusion Fund is January 31st.
How to Sponsor Diversity Tickets:
Sponsoring diversity tickets is an opportunity for organizations and individuals to actively contribute to a more inclusive Drupal community. By sponsoring a diversity ticket and inclusion funds, you become a part of the bridge, enabling someone to participate who might otherwise face barriers.
Sponsorship as an Organization:
Drupal Mountain Camp offers sponsorship opportunities for Diversity & Inclusion sponsors to contribute to free diversity tickets and funds aimed at encouraging diverse participation at the event. Sponsors will receive recognition during the event, showcasing their commitment to diversity and inclusion. Any remaining funds will be allocated towards the seamless organization of the event. Please check out the Sponsorship page for more details and contact information.
Sponsorship as an Individual:
Drupal Mountain Camp extends the opportunity for individual sponsorship of Diversity tickets. You can buy one or more "Sponsor a Diversity Ticket" on the eventfrog.ch website. Your contribution will directly support the provision of complimentary tickets to foster diversity within the community.
Join us on this exciting journey of building bridges that strengthen the Drupal community, and help make Drupal Mountain Camp 2025 an unforgettable celebration of unity in diversity. Together, let's create an environment where everyone feels not only welcome but integral to the shared narrative of our community.
Golems GABB: Drupal Starshot
Ready for a Drupal revolution? At a recent keynote, Dries Buytaert, the mastermind behind Drupal, introduced Drupal Starshot, a groundbreaking initiative set to make building websites with Drupal incredibly simple.
"Drupal has always been known for its low-code capabilities. However, many competitors now offer similar features, and in some areas, they even surpass what Drupal provides. While Drupal is celebrated for its robustness, it can be challenging for newcomers, especially those with limited technical expertise. So in my keynote, I was excited to introduce Drupal Starshot, our "Moonshot" to make Drupal more accessible and easier to use." — said Dries.
Droptica: How to Quickly Create a Website for the Manufacturing Industry? Using Droopler
Creating a website for a manufacturing company requires thoughtful planning and customization of functionality to meet the needs of different user groups. Drupal, as a flexible CMS, offers tools to quickly and efficiently build web pages tailored to market requirements. In this article, we’ll discuss step-by-step how to create a modern and functional website for a manufacturing company using Drupal and its Droopler distribution.
LostCarPark Drupal Blog: Drupal Advent Calendar day 6 - Live Preview
With the launch of Drupal CMS comes a new trial experience, making it easier than ever for non-technical evaluators to try Drupal, without needing to set up a local environment or any special tools.
I spoke with Matt Glaman, who was leading the Live Preview track in the initial development stage.
Matt has done a lot of amazing work to allow a webserver running Drupal to run directly in the web browser on your computer. How does this work? Matt took PHP, along with a webserver and database engine, and compiled them into WebAssembly, which is a language that runs in the browser, a bit like…
Morpht: Unleashing the power of Metatag custom tags
Reproducible Builds (diffoscope): diffoscope 284 released
The diffoscope maintainers are pleased to announce the release of diffoscope version 284. This version includes the following changes:
[ Chris Lamb ]
* Simplify tests_quines.py::test_{differences,differences_deb} to use assert_diff and not mangle the expected test output.
* Update some tests to support file(1) version 5.46. (Closes: reproducible-builds/diffoscope#395)

You can find out more by visiting the project homepage.
icecream!
Lots of KDE hacking these days, and that comes with compiling large amounts of code. Right now, I am installing (well, building from source) Plasma Mobile on an “old” laptop so I can test some patches natively on a touchscreen device. The machine has just two cores (hyperthreaded), so builds take rather long, especially if you build Qt and all those 80+ packages that are needed for a fully working Plasma system.
One of the tools that does an incredible job while being super flexible to use is Icecream. Icecream (or “icecc”) allows you to distribute your build over multiple machines: it basically ships compile jobs, with everything needed to run them, to other machines on a local network, meaning you can parallelize your builds.
Icecream has this nice visualization tool, called icecream-monitor, which you can stare at while your builds are running (in case you don’t have anyone handy for a sword-fight). In the screenshot you can see manta, the underpowered laptop, doing a 32-parallel-job build over the network. miro is my heavy workstation with 8 cores and 128GB of RAM; it duly gets the bulk of the work assigned. frame, my (Framework) laptop, which is also quite beefy, gets something to do too, but isn’t taxed as heavily as that build monster in my basement office.
Icecream can be used with most environments that have you run your compiler locally. Different distros are no problem! Just a matching CPU architecture is needed. Icecream does its job by providing its own g++ and gcc binaries, which relay the build jobs transparently to either your local machine or across the network. So you basically install it, adjust your PATH variable to make sure icecc’s g++ is found before your system’s compiler, and start your build. Other machines you want to join in for the fun just need to run the icecream daemon (iceccd), and they will be automatically discovered as build nodes on your network. If you want to further speed up builds, it works with ccache as well.
Please note that you only want to do this in a trusted environment; we’re shipping executables around the network without authorization!
Christian Ledermann: Trusted publishing ‐ It has never been easier to publish your python packages
Publishing Python packages used to be a daunting task, but not any more. Even better, it has become significantly more secure. Gone are the days of juggling usernames, passwords, or API tokens while relying on CLI tools. With trusted publishing, you simply provide PyPI with the details of your GitHub repository, and GitHub Actions takes care of the heavy lifting.
How to Publish Your Python Package with Trusted Publishing
I will introduce a workflow that will publish your package to TestPyPI when a tag is created (on the development branch), or to PyPI when you merge to the main branch.
Prepare Your Package for Publishing
Ensure your Python package follows PyPI’s packaging guidelines. At a minimum, you’ll need:
- A setup.py or pyproject.toml file defining your package metadata.
- Properly structured code with a clear directory layout.
- A README file to showcase your project on PyPI.
For a detailed checklist, refer to the Python Packaging User Guide.
Configure GitHub Actions in Your Repository
Let's start by creating a new GitHub action .github/workflows/test-build-publish.yml.
This action will build your package and upload the built wheel and the source distribution (SDist) as GitHub Actions artefacts.
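A minimal sketch of what such a build job might look like (the trigger, step names, and action versions here are my assumptions, not taken from the original article):

name: test-build-publish

on: push

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.x"
      # Build the wheel and sdist into ./dist
      - run: python -m pip install build
      - run: python -m build
      # Store the built distributions so the later publish jobs can pick them up
      - uses: actions/upload-artifact@v4
        with:
          name: python-package-distributions
          path: dist/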
Next, we add a step to publish to TestPyPI. This step will run whenever a tag is created, ensuring that the build from the previous step has completed successfully. Replace PROJECT_OWNER and PROJECT_NAME with the appropriate values for your repository.
This step downloads the artefacts created during the build process and uploads them to TestPyPI for testing.
In the last step, we will upload the package to PyPI when a pull request is merged into the main branch.
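Sketching both publishing jobs as a continuation of the hypothetical workflow above (the job names, artefact name, and if: conditions are illustrative assumptions; only the test-release and release environments and the TestPyPI-on-tag / PyPI-on-main split come from the article):

  publish-to-testpypi:
    # Runs when a tag is pushed, after a successful build.
    # The article has you replace PROJECT_OWNER and PROJECT_NAME with your
    # repository's values, e.g. in a condition like
    # github.repository == 'PROJECT_OWNER/PROJECT_NAME'.
    if: startsWith(github.ref, 'refs/tags/')
    needs: build
    runs-on: ubuntu-latest
    environment: test-release
    permissions:
      id-token: write  # trusted publishing uses OIDC, no API token needed
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: python-package-distributions
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          repository-url: https://test.pypi.org/legacy/

  publish-to-pypi:
    # Runs once the changes have been merged into the main branch.
    if: github.ref == 'refs/heads/main'
    needs: build
    runs-on: ubuntu-latest
    environment: release
    permissions:
      id-token: write
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: python-package-distributions
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1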
To ensure that only specific tags trigger the publishing workflow and to maintain control over your release process, configure GitHub deployment environments with tag rules:
Create a new environment test-release by navigating to Settings -> Environments in your GitHub repository.
Set up the environment and add a deployment tag rule.
Limit which branches and tags can deploy to this environment based on naming patterns.
Configure the target tags.
The pattern [0-9]*.[0-9]*.[0-9]* matches semantic versioning tags such as 1.2.3, 0.1.0, or 2.5.1b3, but it excludes arbitrary tags like bugfix-567 or feature-update.
Repeat this for the release environment to protect the main branch in the same way, but this time targeting the main branch.
Set Up a PyPI Project and Link Your GitHub Repository
Create an account on TestPyPI if you don’t have one.
Navigate to your account’s Publishing settings and add a new pending publisher.
Link your GitHub repository to the PyPI project by providing its name, your GitHub username, the repository name, the workflow name (test-build-publish.yml) and the environment name (test-release).
Repeat the above on PyPI with the environment name set to release.
Test the Workflow
Now, whenever you create a tag on your development branch, it will trigger a release to be uploaded to TestPyPI, and merging the development branch into main will upload a release to PyPI.
What Wasn't Covered
While this guide provides an introduction to trusted publishing workflows, there are additional steps and best practices you might consider implementing. For example, setting up branch protection rules can ensure only authorized collaborators can push tags or merge to protected branches, like main or develop. You can also enforce status checks or require pull request reviews before merging, adding another layer of quality assurance.
Have a look at my python-repository-template that covers additional enhancements to this workflow, such as requiring unit and static tests to pass, checking the package with pyroma, and ensuring that your tag matches the version of your package with vercheck.
Summary
If you've been holding back on sharing your work, now is the perfect time to try trusted publishing.
- Introducing 'Trusted Publishers': The Python Package Index Blog highlights a more secure publishing method that does not require long-lived passwords or API tokens to be shared with external systems.
- Publishing to PyPI with a Trusted Publisher: The official PyPI documentation to get started with using trusted publishers on PyPI.
- Building and testing Python in the official GitHub docs.
Luke Plant: Check if a point is in a cylinder - geometry and code
In my current project I’m doing a fair amount of geometry, and one small problem I needed to solve a while back was finding whether a point is inside a cylinder.
The accepted answer for this on math.stackexchange.com wasn’t ideal — part of it was very over-complicated, and also didn’t work under some circumstances. So I contributed my own answer. In this post, in addition to the maths, I’ll give an implementation in Python.
Method
We can solve this problem by constructing the cylinder negatively:
- Start with an infinite space.
- Throw out everything that isn't within the cylinder.
This is a classic mathematician’s approach, but it works great here, and it also works pretty well for any simply-connected solid object with only straight or concave surfaces, depending on how complex those surfaces are.
First some definitions:
Our cylinder is defined by two points, A and B, and a radius R
The point we want to test is P
The vectors from the origin to points A, B and P are \(\boldsymbol{r}_A\), \(\boldsymbol{r}_B\) and \(\boldsymbol{r}_P\) respectively.
We start with the infinite space, that is we assume all points are within the cylinder until we show they aren’t.
Then we construct 3 cuts to exclude certain values of \(\boldsymbol{r}_P\).
First, a cylindrical cut of radius R about an infinite line that goes through A and B. (This was taken from John Alexiou's answer in the link above):
The vector from point A to point B is:
\begin{equation*} \boldsymbol{e} = \boldsymbol{r}_B-\boldsymbol{r}_A \end{equation*}
This defines the direction of the line through A and B.
The distance from any point P at vector \(\boldsymbol{r}_P\) to the line is:
\begin{equation*} d = \frac{\| \boldsymbol{e}\times\left(\boldsymbol{r}_{P}-\boldsymbol{r}_{A}\right) \|}{\|\boldsymbol{e}\|} \end{equation*}
This is based on finding the distance of a point to a line, and using point A as an arbitrary point on the line. We could equally have used B.
We then simply exclude all points with \(d > R\).
This can be optimised slightly by squaring both sides of the comparison to avoid two square root operations.
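Written out, the squared form of that comparison excludes a point whenever
\begin{equation*} \| \boldsymbol{e}\times\left(\boldsymbol{r}_{P}-\boldsymbol{r}_{A}\right) \|^{2} > R^{2}\,\|\boldsymbol{e}\|^{2} \end{equation*}
which is equivalent to \(d > R\) but avoids both square roots.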
Second, a planar cut that throws away the space above the "top" of the cylinder, which I'm calling A.
The plane is defined by any point on it (A will do), and any normal pointing out of the cylinder, \(-\boldsymbol{e}\) will do (i.e. in the opposite direction to \(\boldsymbol{e}\) as defined above).
We can see which side a point is of this plane as per Relation between a point and a plane:
The point is "above" the plane (in the direction of the normal) if:
\begin{equation*} (\boldsymbol{r}_P - \boldsymbol{r}_A) \cdot -\boldsymbol{e} > 0 \end{equation*}
We exclude points which match the above.
Third, a planar cut which throws away the space below the bottom of the cylinder B.
This is the same as the previous step, but with the other end of the cylinder and the normal vector in the other direction, so the condition is:
\begin{equation*} (\boldsymbol{r}_P - \boldsymbol{r}_B) \cdot \boldsymbol{e} > 0 \end{equation*}
Below is a minimal implementation with zero dependencies outside the standard lib, in which there are just enough classes, with just enough methods, to express the algorithm neatly, following the above steps exactly.
In a real implementation:
You might separate out a Point class as being semantically different from Vec
You could move some functions to be methods (in my real implementation, I have a Cylinder.contains_point() method, for example)
You should probably use @dataclass(frozen=True) – immutable objects are a good default. I didn’t use them here because they aren’t needed and I’m focusing on clarity of the code.
Conversely, if performance is more of a consideration, and you don’t have a more general need for classes like Vec and Cylinder:
you might use a more efficient representation, such as a tuple or list, or a numpy array especially if you wanted bulk operations.
you could use more generic dot product functions etc. from numpy.
you might inline more of the algorithm into a single function.
For clarity, I also have not implemented the optimisation mentioned in which you can avoid doing some square root operations.
from __future__ import annotations

import math
from dataclasses import dataclass

# Implementation of https://math.stackexchange.com/questions/3518495/check-if-a-general-point-is-inside-a-given-cylinder
# See the accompanying blog post http://lukeplant.me.uk/blog/posts/check-if-a-point-is-in-a-cylinder-geometry-and-code/


# -- Main algorithm --

def cylinder_contains_point(cylinder: Cylinder, point: Vec) -> bool:
    # First condition: distance from axis
    cylinder_direction: Vec = cylinder.end - cylinder.start
    point_distance_from_axis: float = abs(
        cross_product(
            cylinder_direction,
            (point - cylinder.start),
        )
    ) / abs(cylinder_direction)
    if point_distance_from_axis > cylinder.radius:
        return False

    # Second condition: point must lie below the top plane.
    # Third condition: point must lie above the bottom plane
    # We construct planes with normals pointing out of the cylinder at both
    # ends, and exclude points that are outside ("above") either plane.
    start_plane = Plane(cylinder.start, -cylinder_direction)
    if point_is_above_plane(point, start_plane):
        return False

    end_plane = Plane(cylinder.end, cylinder_direction)
    if point_is_above_plane(point, end_plane):
        return False

    return True


# -- Supporting classes and functions --

@dataclass
class Vec:
    """
    A Vector in 3 dimensions, also used to represent points in space
    """

    x: float
    y: float
    z: float

    def __add__(self, other: Vec) -> Vec:
        return Vec(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other: Vec) -> Vec:
        return self + (-other)

    def __neg__(self) -> Vec:
        return -1 * self

    def __mul__(self, scalar: float) -> Vec:
        return Vec(self.x * scalar, self.y * scalar, self.z * scalar)

    def __rmul__(self, scalar: float) -> Vec:
        return self * scalar

    def __abs__(self) -> float:
        return math.sqrt(self.x**2 + self.y**2 + self.z**2)


@dataclass
class Plane:
    """
    A plane defined by a point on the plane, `origin`, and a `normal` vector
    to the plane.
    """

    origin: Vec
    normal: Vec


@dataclass
class Cylinder:
    """
    A closed cylinder defined by start and end points along the center line
    and a radius
    """

    start: Vec
    end: Vec
    radius: float


def cross_product(a: Vec, b: Vec) -> Vec:
    return Vec(
        a.y * b.z - a.z * b.y,
        a.z * b.x - a.x * b.z,
        a.x * b.y - a.y * b.x,
    )


def dot_product(a: Vec, b: Vec) -> float:
    return a.x * b.x + a.y * b.y + a.z * b.z


def point_is_above_plane(point: Vec, plane: Plane) -> bool:
    """
    Returns True if `point` is above the plane — that is on the side of the
    plane which is in the direction of the plane `normal`.
    """
    # See https://math.stackexchange.com/a/2998886/78071
    return dot_product((point - plane.origin), plane.normal) > 0


# -- Tests --

def test_cylinder_contains_point():
    # Test cases constructed with help of Geogebra - https://www.geogebra.org/calculator/tnc3arfm
    cylinder = Cylinder(start=Vec(1, 0, 0), end=Vec(6.196, 3, 0), radius=0.5)

    # In the Z plane:
    assert cylinder_contains_point(cylinder, Vec(1.02, 0, 0))
    assert not cylinder_contains_point(cylinder, Vec(0.98, 0, 0))  # outside bottom plane
    assert cylinder_contains_point(cylinder, Vec(0.8, 0.4, 0))
    assert not cylinder_contains_point(cylinder, Vec(0.8, 0.5, 0))  # too far from center
    assert not cylinder_contains_point(cylinder, Vec(0.8, 0.3, 0))  # outside bottom plane
    assert cylinder_contains_point(cylinder, Vec(1.4, -0.3, 0))
    assert not cylinder_contains_point(cylinder, Vec(1.4, -0.4, 0))  # too far from center
    assert cylinder_contains_point(cylinder, Vec(6.2, 2.8, 0))
    assert not cylinder_contains_point(cylinder, Vec(6.2, 2.2, 0))  # too far from center
    assert not cylinder_contains_point(cylinder, Vec(6.2, 3.2, 0))  # outside top plane

    # Away from Z plane
    assert cylinder_contains_point(cylinder, Vec(1.02, 0, 0.2))
    assert not cylinder_contains_point(cylinder, Vec(1.02, 0, 1))  # too far from center
    assert not cylinder_contains_point(cylinder, Vec(0.8, 0.3, 2))  # too far from center, and outside bottom plane

ImageX: Women in Drupal 2024: Exclusive Interview with Award Winner Alla Petrovska from Our Team
It’s a great blessing to work alongside outstanding women and an enormous joy to see their efforts and contributions recognized.
EuroPython Society: EPS Board 2024-2025
We’re happy to announce our new board for the 2024-2025 term:
- Anders Hammarquist
- Aris Nivorils
- Artur Czepiel (Chair)
- Cyril Bitterich
- Ege Akman
- Mia Bajić (Vice Chair)
- Shekhar Koirala
You can read more about them in their nomination post at https://www.europython-society.org/list-of-eps-board-candidates-for-2024-2025/. The minutes and the video recording of the General Assembly 2024 will be published soon.
Together, we will continue to serve the community and head off to the preparations for EuroPython 2025!
Freelock Blog: Automatically tag articles
Today, another automation using the Drupal #AI module -- automatically tag your articles.
With the AI module, its AI Automators submodule, and a provider configured, you can add an automation to any field. With a Tag field on your content, you can edit the field definition and "Enable AI Automator". Give it a reasonable prompt, and it will tag your content for you.
Like most of the AI integrations, the great thing is you can easily edit the tags later if you want to highlight something specific, or if it comes up with something inappropriate.
Drupal Association blog: One month until Drupal 7 End of Life on 5 January 2025!
As you’ve most likely heard already, Drupal 7's End of Life is fast approaching. Drupal 7 security support is ending one month from today on 5 January 2025 – fourteen years to the day that Drupal 7 was originally released! If you are still running Drupal 7 beyond this date, your website will be vulnerable to security risks and may face compatibility issues.
With only one month left until Drupal 7 security support ends, now is the time to wrap up any final preparations to get your site ready in time for the end of life. While migrating from Drupal 7 may seem daunting, the Drupal Association is here to assure you that the process can be smooth from start to finish with sources like our Drupal 7 migration resource page.
Hopefully, you have started your plan for life after Drupal 7, but if you are looking for direction, here are some paths you can take:
Extended Security Support for Drupal 7
If you plan on keeping your site running on Drupal 7, check out our Drupal 7 Extended Security Support Program, if you haven’t already.
Our D7 extended security support partners are ready to provide you with the support that you need!
Migration partners
On top of engaging an extended support partner, the Drupal Association has also created a list of certified migration partners for sites of all sizes. These partners can help you with your content strategy, audit your existing site, and help you through every step of the migration process to upgrade your site to modern Drupal.
Check out those who have already migrated!
Many folks have already migrated from Drupal 7, and you can find their stories on our case studies page.
Not migrating? Be sure to communicate with your users!
If migrating to a newer version or using extended support isn't feasible right now, it’s essential to notify your users of your security strategy. Make sure to inform your customers, managers, CISO, or other stakeholders about your plans for handling support and managing potential vulnerabilities. This transparency is important for maintaining trust and compliance!
ComputerMinds.co.uk: DDEV, solr, and platform.sh
We use platform.sh to host many of our client Drupal sites, and many of those sites use Solr.
Platform.sh has great documentation for setting up Solr to work with Drupal, and it essentially boils down to something like this in a services.yaml file:
solr94:
    type: solr:9.4
    disk: 1024
    configuration:
        cores:
            main:
                conf_dir: !archive "solr_9.x_config/"
        endpoints:
            main:
                core: main

And then a directory of Solr config.
Platform.sh then gives you a nice PHP library that can wire up your configuration into Drupal's settings.php like this:
$platformsh->registerFormatter('drupal-solr', function($solr) {
  // Default the solr core name to `collection1` for pre-Solr-6.x instances.
  return [
    'core' => substr($solr['path'], 5) ? : 'collection1',
    'path' => '',
    'host' => $solr['host'],
    'port' => $solr['port'],
  ];
});

// The Solr relationship name (from .platform.app.yaml)
$relationship_name = 'solrsearch';
// The machine name of the server in your Drupal configuration.
$solr_server_name = 'solr_server';

if ($platformsh->hasRelationship($relationship_name)) {
  // Set the connector configuration to the appropriate value, as defined by the formatter above.
  $config['search_api.server.' . $solr_server_name]['backend_config']['connector_config'] = $platformsh->formattedCredentials($relationship_name, 'drupal-solr');
}

All nice and simple, and works really well in platform.sh etc.
DDEV
We wanted our DDEV-based development environments to match the platform.sh approach as closely as possible. Specifically, we really didn't want to have two versions of the Solr config, and we didn't want to have one process for DDEV and another for platform.sh.
We are targeting Solr 9, so we are able to use the fantastic ddev/ddev-solr add-on that does the hard work of getting a Solr container running etc.
The add-on has the option of sending the Solr config (which Drupal can generate) to the Solr server, but then that config isn't the same as the config used by platform.sh. So we wanted to use the option where we make a core from an existing set of config on disk. To do this we need a couple of things:
First, we make a 'placeholder' configset by creating a file at .ddev/solr/configsets/main/README.md with the following contents:
This directory will get mounted over the top of by the config from the .platform/solr_9.x_config directory at the root of this repository. See: docker-compose.cm-ddev-match-platform.yaml

And then we need a file .ddev/docker-compose.cm-ddev-match-platform.yaml with the following contents:
services:
    solr:
        volumes:
            # We sneakily mount our platform.sh config into the place that ddev-solr wants it.
            - ../.platform/solr_9.x_config:/mnt/ddev_config/solr/configsets/main

As the comment in the file says, this means that the same config is both used by platform.sh and the DDEV solr server for the main core.
Finally, we add a bit of a settings.php config for developers running DDEV:
// Hardcode the settings for the Solr config to match our ddev config.
// The machine name of the server in your Drupal configuration: 'solr_server'.
$config['search_api.server.solr_server']['backend_config']['connector'] = 'basic_auth';
$config['search_api.server.solr_server']['backend_config']['connector_config']['host'] = 'solr';
$config['search_api.server.solr_server']['backend_config']['connector_config']['username'] = 'solr';
$config['search_api.server.solr_server']['backend_config']['connector_config']['password'] = 'SolrRocks';
$config['search_api.server.solr_server']['backend_config']['connector_config']['core'] = 'main';

And that's it! So simple, thanks DDEV!
Reproducible Builds: Reproducible Builds in November 2024
Welcome to the November 2024 report from the Reproducible Builds project!
Our monthly reports outline what we’ve been up to over the past month and highlight items of news from elsewhere in the world of software supply-chain security where relevant. As ever, if you are interested in contributing to the Reproducible Builds project, please visit our Contribute page on our website.
Table of contents:
- Reproducible Builds mourns the passing of Lunar
- Introducing reproduce.debian.net
- New landing page design
- SBOMs for Python packages
- Debian updates
- Reproducible builds by default in Maven 4
- PyPI now supports digital attestations
- “Dependency Challenges in OSS Package Registries”
- Zig programming language demonstrated reproducible
- Website updates
- Upstream patches
- Misc development news
- Reproducibility testing framework
The Reproducible Builds community sadly announced it has lost its founding member, Lunar. Jérémy Bobbio aka ‘Lunar’ passed away on Friday November 8th in palliative care in Rennes, France.
Lunar was instrumental in starting the Reproducible Builds project in 2013 as a loose initiative within the Debian project. He was the author of our earliest status reports and many of our key tools in use today are based on his design. Lunar’s creativity, insight and kindness were often noted.
You can view our full tribute elsewhere on our website. He will be greatly missed.
In happier news, this month saw the introduction of reproduce.debian.net. Announced at the recent Debian MiniDebConf in Toulouse, reproduce.debian.net is an instance of rebuilderd operated by the Reproducible Builds project.
rebuilderd is our server designed to monitor the official package repositories of Linux distributions and attempt to reproduce the observed results there.
In November, reproduce.debian.net began rebuilding Debian unstable on the amd64 architecture, but throughout the MiniDebConf, it had attempted to rebuild 66% of the official archive. From this, it could be determined that it is currently possible to bit-for-bit reproduce and corroborate approximately 78% of the actual binaries distributed by Debian — that is, using the .buildinfo files hosted by Debian itself.
reproduce.debian.net also contains instructions on how to set up one’s own rebuilderd instance, and we very much invite everyone with a machine to spare to set up their own version and to share the results. Whilst rebuilderd is still in development, it has been used to reproduce Arch Linux since 2019. We are especially looking for installations targeting Debian architectures other than i386 and amd64.
As part of a very productive partnership with the Sovereign Tech Fund and Neighbourhoodie, we are pleased to unveil our new homepage/landing page.
We are very happy with our collaboration with both STF and Neighbourhoodie (including many changes not directly related to the website), and look forward to working with them in the future.
SBOMs for Python packages
The Python Software Foundation has announced a new “cross-functional project for SBOMs and Python packages”. Seth Michael Larson writes that the project is “specifically looking to solve these issues”:
- Enable Python users that require SBOM documents (likely due to regulations like CRA or SSDF) to self-serve using existing SBOM generation tools.
- Solve the “phantom dependency” problem, where non-Python software is bundled in Python packages but not recorded in any metadata. This makes the job of software composition analysis (SCA) tools difficult or impossible.
- Make the adoption work by relevant projects such as build backends, auditwheel-esque tools, as minimal as possible. Empower users who are interested in having better SBOM data for the Python projects they are using to be able to contribute engineering time towards that goal.
A GitHub repository for the initiative is available, and there are a number of queries, comments and remarks on Seth’s Discourse forum post.
There was significant development within Debian this month. Firstly, at the recent MiniDebConf in Toulouse, France, Holger Levsen gave a Debian-specific talk on rebuilding packages distributed from ftp.debian.org — that is to say, how to reproduce the results from the official Debian build servers:
Holger described the talk as follows:
For more than ten years, the Reproducible Builds project has worked towards reproducible builds of many projects, and for ten years now we have build Debian packages twice—with maximal variations applied—to see if they can be build reproducible still.
Since about a month, we’ve also been rebuilding trying to exactly match the builds being distributed via ftp.debian.org. This talk will describe the setup and the lessons learned so far, and why the results currently are what they are (spoiler: they are less than 30% reproducible), and what we can do to fix that.
The Debian Project Leader, Andreas Tille, was present at the talk and remarked later in his Bits from the DPL update that:
It might be unfair to single out a specific talk from Toulouse, but I’d like to highlight the one on reproducible builds. Beyond its technical focus, the talk also addressed the recent loss of Lunar, whom we mourn deeply. It served as a tribute to Lunar’s contributions and legacy. Personally, I’ve encountered packages maintained by Lunar and bugs he had filed. I believe that taking over his packages and addressing the bugs he reported is a meaningful way to honor his memory and acknowledge the value of his work.
Holger’s slides and video in .webm format are available.
Next, rebuilderd is the server to monitor package repositories of Linux distributions and attempt to reproduce the observed results. This month, version 0.21.0 was released, most notably with improved support for binNMUs by Jochen Sprickerhof and an update of the rebuilderd-debian.sh integration to the latest debrebuild version by Holger Levsen. There has also been significant work to get the rebuilderd package into the Debian archive; in particular, both rust-rebuilderd-common version 0.20.0-1 and rust-rust-lzma version 0.6.0-1 were packaged by kpcyrd and uploaded by Holger Levsen.
Related to this, Holger Levsen submitted three additional issues against rebuilderd as well:
- rebuildctl should be more verbose when encountering issues. […]
- Please add an option to used randomised queues. […]
- Scheduling and re-scheduling multiple packages at once. […]
… and lastly, Jochen Sprickerhof submitted an issue requesting that rebuilderd download the source package in addition to the .buildinfo file […] and kpcyrd also submitted and fixed an issue surrounding dependencies and clarifying the license […]
Separate to this, back in 2018, Chris Lamb filed a bug report against the sphinx-gallery package as it generates unreproducible content in various ways. This month, however, Dmitry Shachnev finally closed the bug, listing the multiple sub-issues that were part of the problem and how they were resolved.
Elsewhere, Roland Clobus posted to our mailing list this month, asking for input on a bug in Debian’s ca-certificates-java package. The issue is that the Java key management tools embed timestamps in its output, and this output ends up in the /etc/ssl/certs/java/cacerts file on the generated ISO images. A discussion resulted from Roland’s post suggesting some short- and medium-term solutions to the problem.
Holger Levsen uploaded some packages with reproducibility-related changes:
- devscripts versions 2.24.3, 2.24.4 and 2.24.5 were uploaded, including several fixes for the debrebuild and debootsnap scripts.
- cdbs version 0.4.167 was uploaded in order to drop dh_buildinfo support, as dpkg has generated .buildinfo files since 2016 and the results of dh_buildinfo are typically unreproducible. Related to this, a mass bug filing by Helmut Grohne intended to remove the obsolete and deprecated dh-buildinfo package from the archive. At the time of writing, this still affects 311 packages in Debian unstable.
Lastly, 12 reviews of Debian packages were added, 5 were updated and 21 were removed this month adding to our knowledge about identified issues in Debian.
On our mailing list this month, Hervé Boutemy reported the latest release of Maven (4.0.0-beta-5) has reproducible builds enabled by default. In his mailing list post, Hervé mentions that “this story started during our Reproducible Builds summit in Hamburg”, where he created the upstream issue that builds on a “multi-year” effort to have Maven builds configured for reproducibility.
Elsewhere in the Python ecosystem and as reported on LWN and elsewhere, the Python Package Index (PyPI) has announced that it has finalised support for PEP 740 (“Index support for digital attestations”).
Trail of Bits, who performed much of the development work, has an in-depth blog post about the work and its adoption, as well as what is left undone:
One thing is notably missing from all of this work: downstream verification. […]
This isn’t an acceptable end state (cryptographic attestations have defensive properties only insofar as they’re actually verified), so we’re looking into ways to bring verification to individual installing clients. In particular, we’re currently working on a plugin architecture for pip that will enable users to load verification logic directly into their pip install flows.
There was an in-depth discussion on LWN’s announcement page, as well as on Hacker News.
At BENEVOL, the Belgium-Netherlands Software Evolution workshop in Namur, Belgium, Tom Mens and Alexandre Decan presented their paper, “An Overview and Catalogue of Dependency Challenges in Open Source Software Package Registries”.
The abstract of their paper is as follows:
While open-source software has enabled significant levels of reuse to speed up software development, it has also given rise to the dreadful dependency hell that all software practitioners face on a regular basis. This article provides a catalogue of dependency-related challenges that come with relying on OSS packages or libraries. The catalogue is based on the scientific literature on empirical research that has been conducted to understand, quantify and overcome these challenges. […]
A PDF of the paper is available online.
Motiejus Jakšty posted an interesting and practical blog post on his successful attempt to reproduce the Zig programming language without using the pre-compiled binaries checked into the repository, and despite the circular dependency inherent in its bootstrapping process.
As a summary, Motiejus concludes that:
I can now confidently say (and you can also check, you don’t need to trust me) that there is nothing hiding in zig1.wasm [the checked-in binary] that hasn’t been checked-in as a source file.
The full post is full of practical details, and includes a few open questions.
Notwithstanding the significant change to the landing page (screenshot above), there were an enormous number of changes made to our website this month. This included:
- Alex Feyerke and Mariano Giménez:
- Bernhard M. Wiedemann:
- Update the “System images” page to document the e2fsprogs approach. […]
- Chris Lamb:
- Cachebust every CSS file per-release. […]
- Replace some inline markdown with HTML. […]
- Use spaces on the “Publications” page. […]
- Add a news article about the passing of Lunar. […][…][…][…]
- Add a black memorial band to the top of the page. […]
- FC (Fay) Stegerman:
- Replace more inline markdown with HTML on the “Success stories” page. […]
- Add some links, fix some other links and correct some spelling errors on the “Tools” page. […]
- Holger Levsen:
- Julia Krüger:
- Add a new “Stripping of unreproducible information” page to the documentation. […]
- Ninette Adhikari & hulkoba:
- Philip Rinn:
- Import 47 historical weekly reports. […]
- hulkoba:
- Add alt text to almost all images (!). […][…]
- Fix a number of links on the “Talks” page. […][…]
- Avoid so-called ‘ghost’ buttons by not using <button> elements as links, as the affordance of a <button> implies an action with (potentially) a side effect. […][…]
- Center the sponsor logos on the homepage. […]
- Move publications and generate them instead from a data.yml file with an improved layout. […][…]
- Make a large number of small but impactful stylistic changes. […][…][…][…]
- Expand the “Tools” page to include a number of missing tools, fix some styling issues and fix a number of stale/broken links. […][…][…][…][…][…]
The Reproducible Builds project detects, dissects and attempts to fix as many currently-unreproducible packages as possible. We endeavour to send all of our patches upstream where appropriate. This month, we wrote a large number of such patches, including:
- Bernhard M. Wiedemann:
- clisp (fix contributed by Bruno Haible)
- conky (date-related issue)
- emacs-auctex (date-related gzip issue)
- javadoc (filesystem ordering issue)
- jboss-websocket-1.0-api (embeds uname -r)
- lcms2 (CPU issue)
- LiE (ASLR-related issue)
- make_ext4fs (toolchain-related issue for for VM images)
- obs-build (issue when running builds with certain CPU types or core numbers)
- perl-Time-modules (fails to build far in the future)
- python-bson (fails to build far in the future)
- python-exiv2 (fails to build far in the future)
- python-moto (date-related gzip issue)
- python-pyhanko-certvalidator (fails to build far in the future)
- python-python-gvm (concurrency-related issue)
- python310 (fails to build far in the future)
- python313 (fails to build far in the future)
- reproducible-faketools (toolchain for emacs)
- shadowsocks-rust (date-related issue)
- swipl (fails to build far in the future)
- Chris Lamb:
- #1087330 filed against python-pydash.
- #1087485 filed against fritzconnection.
- #1087486 filed against tracy.
- #1088238 filed against rust-broot.
- #1088353 filed against python-aiovlc.
- #1088742 filed against python-aiohomekit.
- James Addison:
- Bernhard M. Wiedemann published another report for the openSUSE distribution.
- Martin Abente Lahaye updated diffoscope to fix a crash when objdump is missing. […]
- On our mailing list, Jan-Benedict Glaw announced the publication of the fifth NetBSD Reproducibility Report.
The Reproducible Builds project operates a comprehensive testing framework running primarily at tests.reproducible-builds.org in order to check packages and other artifacts for reproducibility. In November, a number of changes were made by Holger Levsen, including:
- reproduce.debian.net-related changes:
- Create and introduce a new reproduce.debian.net service and subdomain […]
- Make a large number of documentation changes relevant to rebuilderd. […][…][…][…][…]
- Explain a temporary workaround for a specific issue in rebuilderd. […]
- Setup another rebuilderd instance on the o4 node and update installation documentation to match. […][…]
- Make a number of helpful/cosmetic changes to the interface, such as clarifying terms and adding links. […][…][…][…][…]
- Deploy configuration to the /opt and /var directories. […][…]
- Add an infancy (or ‘alpha’) disclaimer. […][…]
- Add more notes to the temporary rebuilderd documentation. […]
- Commit an nginx configuration file for reproduce.debian.net’s “Stats” page. […]
- Commit a rebuilder-worker.conf configuration for the o5 node. […]
- Debian-related changes:
- Misc changes:
- Adapt the update_jdn.sh script for new Debian trixie systems. […]
- Stop installing the PostgreSQL database engine on the o4 and o5 nodes. […]
- Prevent accidental reboots of the o4 node because of a long-running job owned by josch. […][…]
In addition, Mattia Rizzolo addressed a number of issues with reproduce.debian.net […][…][…][…]. And lastly, both Holger Levsen […][…][…][…] and Vagrant Cascadian […][…][…][…] performed node maintenance.
If you are interested in contributing to the Reproducible Builds project, please visit our Contribute page on our website. However, you can get in touch with us via:
- IRC: #reproducible-builds on irc.oftc.net.
- Mastodon: @reproducible_builds@fosstodon.org
- Mailing list: rb-general@lists.reproducible-builds.org
- Twitter: @ReproBuilds
Hardware Shenanigans
(originally titled “On Dead Trees”)
There are features that you know are really important to some of our users but that you frankly don’t care for much yourself. Printing is one such example. Recently, I actually had to print lots of paperwork, so I had a reason to fix some of my more pressing issues with our Print Manager.
Print jobs right at your finger tips
The biggest regression from the Plasma 4 days, when we moved from individual System Tray popups to a unified square view, was that Print Manager had to give up its two-pane layout that showed the print queue directly in the popup. In order to view and cancel print jobs, you now had to select the printer, open its print queue window, and close it again after you were done.
Unfortunately, with printer management, there are really two opposing use cases: a home computer with maybe a couple of printers, and the office use case of hundreds of remote printers across several buildings. Picking one side usually leaves the other one worse off. However, I did not want to spend too much time on this, so in order to fix my workflow, I simply added the list of print jobs to the expanded view. I then added a busy indicator to a printer when it’s printing, to make it easier to find in the list.
CUPS error messages have never been very nice and with all that “driver-less” stuff the user experience seems to have become worse, spitting technical gibberish like “cfFilterChain: Ghost script (PID 123456)” at the user. While printing probably works better now, the overall feature set has definitely regressed for me. In order to accommodate status messages better, Print Manager now shows up to three lines of text, which is particularly important in case of a printer or network error.
Just pretend the pile of paper at the top of the laser printer is actually the printed pages
Another nice little touch from Plasma 4 was a dedicated laser printer icon. At home I have a black and white laser printer for printing documents and a color inkjet for printing pictures. It’s really nice being able to tell them apart at a glance. Therefore, I added a laser printer icon to Breeze as well. However, when I investigated how it worked, I found it just assumes every black and white printer to be a laser printer. Fair enough. You can ask CUPS for the “marker type”, e.g. toner or ink, and I hoped that I could use it to determine the printer type more accurately. Alas, since updating to the Ubuntu 24.04 base, none of my printers show ink levels anymore, not even after installing the official vendor drivers. Either way, ink status has been hit or miss for me under Linux for as long as I can remember, sometimes randomly working when talking to the printer over the network but not on the computer it was plugged into via USB and so on.
Next, while tinkering with printer settings, I noticed the nice little search box we have in our settings dialogs nowadays. Trying to find a certain option, I was surprised it didn’t highlight it, even though I clearly typed the exact name on the label. You see, controls in Qt can have “mnemonics” or “accelerators”: this is the underlined letter you typically see on a button that tells you what Alt+key to use to trigger it. The letter is prefixed with an ampersand (&) in the string, so “Pap&er size” shows as “Paper size” and will trigger on pressing Alt+E. KDE applications automatically assign a free accelerator to most widgets unless one is explicitly provided through the ampersand notation. The settings search did not account for this and subsequently failed to find the option.
Leaving the subject of printers for now, I made a few minor improvements regarding batteries. One of my earliest contributions to Plasma’s power management system was actually a notification when a peripheral device, such as mouse or keyboard, runs low on battery. While the notification showed a dedicated icon for headsets (i.e. headphones with a microphone) it did not provide one for regular headphones, and neither did battery monitor, but it was an easy fix.
Unlike a mouse or keyboard, it’s probably not as bad that it may turn off at any time
Additionally, when switching output devices, a brief on-screen display is shown. In case of Bluetooth devices, battery status is included alongside the device name, to quickly see when the headphones you just connected are almost out of juice. When I switched to PipeWire this stopped working; no battery percentage was shown. I didn’t fully understand how it works, but with PulseAudio it probably has exclusive access to the Bluetooth device and is the one that has to read the battery information and provide it to others as an audio device property. With PipeWire, I guess things are different, and I just get to read battery information over the regular BlueZ battery interface, so that’s what Plasma will consult before showing the device popup.
Finally, the Energy Information page now displays the number of charge cycles your laptop battery has experienced so far, if available, in addition to the capacity estimation (“battery health”) it already showed. The ability to query this information was added to Solid, KDE’s Hardware Abstraction Framework, and is supported by all of its backends. My trusty ThinkPad has over 700 charge cycles now and still reports 77% capacity left. I was still quite happy with its battery life during this year’s Akademy – admittedly I didn’t compile much during talks and had the screen brightness very low.
If you like what you saw and want to support the KDE Community and enable the good people behind it to create the best software possible, please consider donating to our Year End Fundraiser! KDE is funded mainly by you: our friends, users, and supporters. Thanks to your donations, we can deliver the best free and open software that respects your privacy and gives you control over your devices and digital life.