Feeds

Russ Allbery: Review: The Faithless

Planet Debian - Sun, 2024-01-07 22:47

Review: The Faithless, by C.L. Clark

Series: Magic of the Lost #2
Publisher: Orbit
Copyright: March 2023
ISBN: 0-316-54283-0
Format: Kindle
Pages: 527

The Faithless is the second book in a political fantasy series that seems likely to be a trilogy. It is a direct sequel to The Unbroken, which you should read first. As usual, Orbit made it unnecessarily hard to get re-immersed in the world by refusing to provide memory aids for readers who read books as they come out instead of only when the series is complete, but this is not the fault of Clark or the book and you've heard me rant about this before.

The Unbroken was set in Qazāl (not-Algeria). The Faithless, as readers of the first book might guess from the title, is set in Balladaire (not-France). This is the palace intrigue book. Princess Luca is fighting for her throne against her uncle, the regent. Touraine is trying to represent her people. Whether and to what extent those interests are aligned is much of the meat of this book.

Normally I enjoy palace intrigue novels for the competence porn: watching someone navigate a complex political situation with skill and cunning, or upend the entire system by building unlikely coalitions or using unexpected routes to power. If you are similar, be warned that this is not what you're going to get. Touraine is a fish out of water with no idea how to navigate the Balladairan court, and does not magically become an expert in the course of this novel. Luca has the knowledge, but she's unsure, conflicted, and largely out-maneuvered. That means you will have to brace for some painful scenes of some of the worst people apparently getting what they want.

Despite that, I could not put this down. It was infuriating, frustrating, and a much slower burn than I prefer, but the layers of complex motivations that Clark builds up provided a different sort of payoff.

Two books in, the shape of this series is becoming clearer. This series is about empire and colonialism, but with considerably more complexity than fantasy normally brings to that topic. Power does not loosen its grasp easily, and it has numerous tools for subtle punishment after apparent upstart victories. Righteous causes rarely call banners to your side; instead, they create opportunities for other people to maneuver to their own advantage. Touraine has some amount of power now, but it's far from obvious how to use it. Her life's training tells her that exercising power will only cause trouble, and her enemies are more than happy to reinforce that message at every opportunity.

Most notable to me is Clark's bitingly honest portrayal of the supposed allies within the colonial power. It is clear that Luca is attempting to take the most ethical actions as she defines them, but it's remarkable how those efforts inevitably imply that Touraine should help Luca now in exchange for Luca's tenuous and less-defined possible future aid. This is not even a lie; it may be an accurate summary of Balladairan politics. And yet, somehow what Balladaire needs always matters more than the needs of their abused colony.

Underscoring this, Clark introduces another faction in the form of a populist movement against the Balladairan monarchy. The details of that setup in another fantasy novel would make them allies of the Qazāl. Here, as is so often the case in real life, a substantial portion of the populists are even more xenophobic and racist than the nobility. There are no easy alliances.

The trump card that Qazāl holds is magic. They have it, and (for reasons explored in The Unbroken) Balladaire needs it, although that is a position held by Luca's faction and not by her uncle. But even Luca wants to reduce that magic to a manageable technology, like any other element of the Balladairan state. She wants to understand it, harness it, and bring it under local control. Touraine, trained by Balladaire and facing Balladairan political problems, has the same tendency. The magic, at least in this book, refuses — not in the flashy, rebellious way that it would in most fantasy, but in a frustrating and incomprehensible lack of predictable or convenient rules. I think this will feel like a plot device to some readers, and that is to some extent true, but I think I see glimmers of Clark setting up a conflict of world views that will play out in the third book.

I think some people are going to bounce off this book. It's frustrating, enraging, at times melodramatic, and does not offer the cathartic payoff typically offered in fantasy novels of this type. Usually these are things I would be complaining about as well. And yet, I found it satisfyingly challenging, engrossing, and memorable. I spent a lot of the book yelling "just kill him already" at the characters, but I think one of Clark's points is that overcoming colonial relationships requires a lot more than just killing one evil man. The characters profoundly fail to execute some clever and victorious strategy. Instead, as in the first book, they muddle through, making the best choice that they can see in each moment, making lots of mistakes, and paying heavy prices. It's realistic in a way that has nothing to do with blood or violence or grittiness. (Although I did appreciate having the thin thread of Pruett's story and its highly satisfying conclusion.)

This is also a slow-burn romance, and there too I think opinions will differ. Touraine and Luca keep circling back to the same arguments and the same frustrations, and there were times that this felt repetitive. It also adds a lot of personal drama to the politics in a way that occasionally made me dubious. But here too, I think Clark is partly using the romance to illustrate the deeper political points.

Luca is often insufferable, cruel and ambitious in ways she doesn't realize, and only vaguely able to understand the Qazāl perspective; in short, she's the pragmatic centrist reformer. I am dubious that her ethics would lead her to anything other than endless compromise without Touraine to push her. To Luca's credit, she also realizes that and wants to be a better person, but struggles to have the courage to act on it. Touraine both does and does not want to manipulate her; she wants Luca's help (and more), but it's not clear Luca will give it under acceptable terms, or even understand how much she's demanding. It's that foundational conflict that turns the romance into a slow burn by pushing them apart. Apparently I have more patience for this type of on-again, off-again relationship than one based on artificial miscommunication.

The more I noticed the political subtext, the more engaging I found the romance on the surface.

I picked this up because I'd read several books about black characters written by white authors, and while there was nothing that wrong with those books, the politics felt a little too reductionist and simplified. I wanted a book that was going to force me out of comfortable political assumptions. The Faithless did exactly what I was looking for, and I am definitely here for the rest of the series. In that sense, recommended, although do not go into this book hoping for adroit court maneuvering and competence porn.

Followed by The Sovereign, which does not yet have a release date.

Content warnings: Child death, attempted cultural genocide.

Rating: 7 out of 10

Categories: FLOSS Project Planets

Python People: Will Vincent - Django, Writing Technical Books

Planet Python - Sun, 2024-01-07 20:21

Will Vincent is a former board member of the Django Software Foundation. He's written three books on Django, writes a Django newsletter, and co-hosts the Django Chat podcast.


★ Support this podcast while learning ★

The Complete pytest Course is the best way to learn pytest quickly.

  • Python testing with pytest from beginner through advanced.
  • Covers applying pytest to projects, including continuous integration and reporting.
  • Even covers combining features to great effect.
★ Support this podcast on Patreon ★
Categories: FLOSS Project Planets

Jonathan McDowell: Free Software Activities for 2023

Planet Debian - Sun, 2024-01-07 13:34

This year was hard from a personal and work point of view, which impacted the amount of Free Software bits I ended up doing - even when I had the time I often wasn’t in the right head space to make progress on things. However writing this annual recap up has been a useful exercise, as I achieved more than I realised. For previous years see 2019, 2020, 2021 + 2022.

Conferences

The only Free Software related conference I made it to this year was DebConf23 in Kochi, India. Changes with projects at work meant I couldn’t justify anything work related. This year I’m planning to make it to FOSDEM, and haven’t made a decision on DebConf24 yet.

Debian

Most of my contributions to Free software continue to happen within Debian.

I started the year working on retrogaming with Kodi on Debian. I got this to a much better state for bookworm, with it being possible to run the bsnes-mercury emulator under Kodi using RetroArch. There are a few other libretro backends available for RetroArch, but Kodi needs some extra controller mappings packaged up first.

Plenty of uploads were involved, though some of this was aligning all the dependencies and generally cleaning things up in iterations.

I continued to work on a few packages within the Debian Electronics Packaging Team. OpenOCD produced a new release in time for the bookworm release, so I uploaded 0.12.0-1. There were a few minor sigrok cleanups - sigrok 0.3, libsigrokdecode 0.5.3-4 + libsigrok 0.5.2-4 / 0.5.2-5.

While I didn’t manage to get the work completed I did some renaming of the ESP8266 related packages - gcc-xtensa-lx106 (which saw a 13 upload pre-bookworm) has become gcc-xtensa (with 14) and binutils-xtensa-lx106 has become binutils-xtensa (with 6). Binary packages remain the same, but this is intended to allow for the generation of ESP32 compiler toolchains from the same source.

onak saw 0.6.3-1 uploaded to match the upstream release. I also uploaded libgpg-error 1.47-1 (though I can claim no credit for any of the work in preparing the package) to help move things forward on updating gnupg2 in Debian.

I NMUed tpm2-pkcs11 1.9.0-0.1 to fix some minor issues pre-bookworm release; I use this package myself to store my SSH key within my laptop TPM, so I care about it being in a decent state.

sg3-utils also saw a bit of love with 1.46-2 + 1.46-3 - I don’t work in the storage space these days, but I’m still listed as an uploader and there was an RC bug around the library package naming that I was qualified to fix and test pre-bookworm.

Related to my retroarch work I sponsored uploads of mgba for Ryan Tandy: 0.10.0+dfsg-1, 0.10.0+dfsg-2, 0.10.1+dfsg-1, 0.10.2+dfsg-1, mgba 0.10.1+dfsg-1+deb12u1.

As part of the Data Protection Team I responded to various inbound queries to that team, both from project members and those external to the project.

I continue to keep an eye on Debian New Members, even though I’m mostly inactive as an application manager - we generally seem to have enough available recently. Mostly my involvement is via Front Desk activities, helping out with queries to the team alias, and contributing to internal discussions as well as our panel at DebConf23.

Finally the 3 month rotation for Debian Keyring continues to operate smoothly. I dealt with 2023.03.24, 2023.06.26, 2023.06.29, 2023.09.10, 2023.09.24 + 2023.12.24.

Linux

I had a few minor patches accepted to the kernel this year. A pair of safexcel cleanups (improved error logging for firmware load fail and cleanup on load failure) came out of upgrading the kernel running on my RB5009.

The rest were related to my work on repurposing my C.H.I.P.. The AXP209 driver needed to be extended to support GPIO3 (with associated DT schema update). That allowed Bluetooth to be enabled. Adding the AXP209 internal temperature ADC as an iio-hwmon node means it can be tracked using the normal sensor monitoring framework. And finally I added the pinmux settings for mmc2, which I use to support an external microSD slot on my C.H.I.P.

Personal projects

2023 saw another minor release of onak, 0.6.3, which resulted in a corresponding Debian upload (0.6.3-1). It has a couple of bug fixes (including a particularly annoying, if minor, one around systemd socket activation that felt very satisfying to get to the bottom of), but I still lack the time to do any of the major changes I would like to.

I wrote listadmin3 to allow easy manipulation of moderation queues for Mailman3. It’s basic, but it’s drastically improved my timeliness on dealing with held messages.

Work related

This year only involved a single upstream related submission; a fix for tpm_tis interrupts with the Lenovo P620 that then got dropped when the change that caused the issue was reverted.

That wraps up 2023. I’ve got no particular goals for this year; looking around my desk I’ve a few ARM based devices I’d like to get running a mainline kernel. I need to play about a bit more with the retroarch bits (if I really had time I’d do the migration for Kodi to PCRE2, as that’s currently causing testing migration issues), perhaps getting some more controller mappings packaged. But no promises.

Categories: FLOSS Project Planets

Data School: What are conda, Anaconda, and Miniconda? 🐍

Planet Python - Sun, 2024-01-07 11:57

If you've ever taken one of my data science courses, you've probably noticed that I frequently recommend the Anaconda distribution of Python.

You might be left wondering:

  • What is the Anaconda distribution, and why do people recommend it?
  • How is it related to conda?
  • How is it related to Miniconda?
  • As a Data Scientist, which of these do I need to be familiar with?

I'll answer those questions below! 👇

What is Anaconda?

Anaconda is a Python distribution aimed at Data Scientists that includes 250+ packages (with easy access to 7,500+ additional packages). Its value proposition is that you can download it (for free) and "everything just works." It's available for Mac, Windows, and Linux.

A new Anaconda distribution is released a few times a year. Within each distribution, the versions of the included packages have all been tested to work together.

If you visit the installation page for many data science packages (such as pandas), they recommend Anaconda because it makes installation easy!

What is conda?

conda is an open source package and environment manager that comes with Anaconda.

As a package manager, you can use conda to install, update, and remove packages and their "dependencies" (the packages they depend upon):

  • If Anaconda doesn't include a package that you need, you use conda to download and install it.
  • If Anaconda doesn't have the version of a package you need, you use conda to update it.

As an environment manager, you can use conda to manage virtual environments:

  • If you're not familiar with virtual environments, they allow you to maintain isolated environments with different packages and versions of those packages.
  • conda is an alternative to virtualenv, pipenv, and other related tools.
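
For example, a few representative invocations (the package and environment names here are just placeholders):

conda install pandas              # install a package and its dependencies
conda update pandas               # update it to the latest compatible version
conda create -n ds python=3.11    # create a new, isolated environment
conda activate ds                 # switch into that environment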

conda has a few huge advantages over other tools:

  • It's a single tool to learn, rather than using multiple tools to manage packages, environments, and Python versions.
  • Package installation is predictably easy because you're installing pre-compiled binaries.
  • Unlike pip, you never need to build from source code, which can be especially difficult for some data science packages.
  • You can use conda with languages other than Python.

What is Miniconda?

Miniconda is a Python distribution that only includes Python, conda, their dependencies, and a few other useful packages.

Miniconda is a great choice if you prefer to only install the packages you need, and you're sufficiently familiar with conda. (Here's how to choose between Anaconda and Miniconda.)

Summary:
  • Anaconda and Miniconda are both Python distributions.
  • Anaconda includes hundreds of packages, whereas Miniconda includes just a few.
  • conda is an open source tool that comes with both Anaconda and Miniconda, and it functions as both a package manager and an environment manager.

Personally, I make extensive use of conda for creating environments and installing packages. And since I'm comfortable with conda, I much prefer Miniconda over Anaconda.

Do you have questions about conda, Anaconda, or Miniconda? Let me know in the comments section below! 👇

Categories: FLOSS Project Planets

November and December in KDE PIM

Planet KDE - Sun, 2024-01-07 07:00

Here's our bi-monthly update from KDE's personal information management applications team. This report covers progress made in the months of November and December 2023.

Since the last report, 35 people contributed approximately 1500 code changes, focusing on bugfixes and improvements for the coming 24.02 release based on Qt6.

Transition to Qt6/KDE Frameworks 6

As we plan for KDE Gear 24.02 to be our first Qt6-based release, there's been on-going work on more Qt6 porting, removing deprecated functions, removing support for the qmake build system, and so on.

New feature

Kontact, KMail, KAddressBook and Sieve Editor will now warn you when you are running an unsupported version of KDE PIM and it is time to update it. The code for this is based on Kdenlive.

Itinerary

While the focus for Itinerary was also on completing the transition to Qt 6, with its nightly Android builds switching over a week ago, there also were a number of new features added, such as public transport arrival search, a new journey display in the timeline and a nearby amenity search. See Itinerary's bi-monthly status update for more details.

KOrganizer

In KOrganizer it is now possible to search for events without specifying a time range (BKO#394967). A new configuration option was added to only notify about events where the user is the organizer or an attendee. This is especially useful when users add a shared calendar from their colleagues, but don't want to be notified about events that they do not participate in. KOrganizer also intercepts the SIGINT/SIGTERM signals now, to avoid losing data during editing.

Akonadi

Work has begun on creating a tool that would help users to migrate their Akonadi database to a different database engine. Since we started improving SQLite support in Akonadi during the year, we received many requests from users about how they can easily switch from MySQL or PostgreSQL to SQLite without having to wipe their entire configuration and set up everything from scratch with SQLite. With the database migrator tool, it will be very easy for users to migrate between the supported database engines without having to re-configure and re-sync all their accounts afterwards.

Dan's work on this tool is funded by g10 Code GmbH.

KMail

The email composer received some small visual changes and adopted a more frameless look similar to other KDE applications. In addition, the composer will directly display whether the OpenPGP key of a recipient exists and is valid when encryption is enabled.

The Rust-based ad blocker received further work, and ad-block lists can now be added or removed.

We removed the ability to change the charset of a message and now only support sending emails as UTF-8, which is nowadays supported by all other email clients. Reading emails in other encodings is still supported.

KMail now intercepts the SIGINT/SIGTERM signals, preventing it from being accidentally closed while editing.

Laurent finished implementing the "reopen closed viewer" feature. This provides a menu that allows reopening messages that were closed by mistake.

In addition a lot of bugs were fixed. This includes:

  • Fixed a status bar "white box" UI artifact (BKO#478352).
  • Specific viewer widgets are now loaded on demand, reducing the memory footprint.
  • Fixed the mail filter agent's connection with KMail (fixing "apply filters").

A lot of work was done to rewrite the account wizard in QML. Accounts can now be created manually.

KAddressBook

A bug where postal addresses were mangled when adding an address to a contact has been fixed (BKO#478636). KAddressBook now also intercepts the SIGINT/SIGTERM signals, preventing it from being accidentally closed while editing a contact.

Kleopatra

Development focussed on fixing bugs for the release of GnuPG VS-Desktop® (which includes Kleopatra) in December.

  • OpenPGP keys with valid primary key and expired encryption or signing subkeys are now handled correctly (T6788).
  • Key groups containing expired keys are now handled correctly (T6742).
  • The currently selected certificate in the certificate list stays selected when opening the certificate details or certifying the certificate (T6360).
  • When creating an encrypted archive fails or is aborted, then Kleopatra makes sure that no partially created archive is left over (T6584).
  • Certificates on certain smart cards are now listed even if some of the certificates couldn't be loaded (T6830).
  • The root of a certificate chain with two certificates is no longer shown twice (T6807).
  • A crash when deleting a certificate that is part of a certificate chain with cycles was fixed (T6602).
  • Kleopatra now supports the special keyserver value "none" to disable keyserver lookups (T6761).
  • Kleopatra looks up a key for an email address via WKD (key directory provided by an email domain owner) even if keyserver lookups are disabled (T6868).
  • Signing and encryption of large files is now much faster (especially on Windows where it was much slower than on Linux) (T6351).

pim-sieve-editor

Two bugs were fixed in Sieve Editor:

  • Fixed a script naming issue (BKO#477755).
  • Fixed the missing scrollbar in simple editing mode (BKO#476456).

Sieve Editor now also intercepts the SIGINT/SIGTERM signals, preventing it from being accidentally closed while editing a Sieve script.

pim-data-exporter

It now intercepts the SIGINT/SIGTERM signals, preventing the application from being closed during an import or export.

Categories: FLOSS Project Planets

Valhalla's Things: A Corset or Two

Planet Debian - Sat, 2024-01-06 19:00
Posted on January 7, 2024
Tags: madeof:atoms, craft:sewing, period:victorian, FreeSoftWear

CW for body size change mentions

I needed a corset, badly.

Years ago I had a chance to have my measurements taken by a former professional corset maker and then a lesson in how to draft an underbust corset, and that lead to me learning how nice wearing a well-fitted corset feels.

Later I tried to extend that pattern up for a midbust corset, with success.

And then my body changed suddenly, and I was no longer able to wear either of those, and after a while I started missing them.

Since my body was still changing (if no longer drastically so), and I didn’t want to use expensive materials for something that had a risk of not fitting after too little time, I decided to start by making myself a summer lightweight corset in aida cloth and plastic boning (for which I had already bought materials). It fitted, but not as well as the first two ones, and I’ve worn it quite a bit.

I still wanted back the feeling of wearing a comfy, heavy contraption of coutil and steel, however.

After a lot of procrastination I redrafted a new pattern, scrapped everything, tried again, had my measurements taken by a dressmaker, put them in the draft, cut a first mock-up in cheap cotton, fixed the position of a seam, did a second mock-up in denim from an old pair of jeans, and then cut into the cheap herringbone coutil I was planning to use.

And that’s when I went to see which one of the busks in my stash would work, and realized that I had used a wrong vertical measurement and the front of the corset was way too long for a midbust corset.

Luckily I also had a few longer busks, I basted one to the denim mock up and tried to wear it for a few hours, to see if it was too long to be comfortable. It was just a bit, on the bottom, which could be easily fixed with the Power Tools[1].

Except, the more I looked at it the more doing this felt wrong: what I needed most was a midbust corset, not an overbust one, which is what this was starting to be.

I could have trimmed it down, but I knew that I also wanted this corset to be a wearable mockup for the pattern, to refine it and have it available for more corsets. And I still had more than half of the cheap coutil I was using, so I decided to redo the pattern and cut new panels.

And this is where the “or two” comes in: I’m not going to waste the overbust panels: I had been wanting to learn some techniques to make corsets with a fashion fabric layer, rather than just a single layer of coutil, and this looks like an excellent opportunity for that, together with a piece of purple silk that I know I have in the stash. This will happen later, however, first I’m giving priority to the underbust.

Anyway, a second set of panels was cut, all the seam lines marked with tailor tacks, and I started sewing by inserting the busk.

And then realized that the pre-made boning channel tape I had was too narrow for the 10 mm spiral steel I had plenty of. And that the 25 mm twill tape was also too narrow for a double boning channel. On the other hand, the 18 mm twill tape I had used for the waist tape was good for a single channel, so I decided to put a single bone on each seam, and then add another piece of boning in the middle of each panel.

Since I’m making external channels, making them in self fabric would have probably looked better, but I no longer had enough fabric, because of the cutting mishap, and anyway this is going to be a strictly underwear only corset, so it’s not a big deal.

Once the boning channel situation was taken care of, everything else proceeded quite smoothly and I was able to finish the corset during the Christmas break, enlisting again my SO to take care of the flat steel boning while I cut the spiral steels myself with wire cutters.

I could have been a bit more precise with the binding, as it doesn’t align precisely at the front edge, but then again, it’s underwear, nobody other than me and everybody who reads this post is going to see it and I was in a hurry to see it finished. I will be more careful with the next one.

I also think that I haven’t been careful enough when pressing the seams and applying the tape, and I’ve lost about a cm of width per part, so I’m using a lacing gap that is a bit wider than I planned for, but that may change as the corset gets worn, and is still within tolerance.

Also, on the morning after I had finished the corset I woke up and realized that I had forgotten to add garter tabs at the bottom edge. I don’t know whether I will ever use them, but I wanted the option, so maybe I’ll try to add them later on, especially if I can do it without undoing the binding.

The next step would have been flossing, which I proceeded to postpone until I’ve worn the corset for a while: not because there is any reason for it, but because I still don’t know how I want to do it :)

What was left was finishing and uploading the pattern and instructions, which are now on my sewing pattern website as #FreeSoftWear, and finally I could post this on the blog.

  1. i.e. by asking my SO to cut and sand it, because I’m lazy and I hate doing that part :D

Categories: FLOSS Project Planets

mailutils @ Savannah: GNU mailutils version 3.17

GNU Planet! - Sat, 2024-01-06 10:20

GNU mailutils version 3.17 is available for download. This is a maintenance release, including some new features:

Use of TLS in pop3d and imap4d


If not explicitly specified, the TLS mode to use (ondemand, connect, etc.) is derived from the configured port.  E.g., for imap4d, port 143 implies ondemand mode, and port 993 implies connection mode.

The global tls-mode setting is used only when the mode cannot be determined otherwise, i.e. neither per-server tls-mode is given nor the port gives any clues as to the TLS mode to use.
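
For illustration, a hypothetical configuration fragment (the block and statement syntax here is an assumption modeled on the general mailutils configuration format; only tls-mode and its ondemand/connection values come from the release notes):

imap4d {
  # port 143 implies ondemand (STARTTLS) mode
  server 0.0.0.0:143;
  # port 993 implies connection mode (TLS from the first byte)
  server 0.0.0.0:993;
  # global fallback, used only when neither a per-server
  # tls-mode nor the port determines the mode
  tls-mode ondemand;
};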

Categories: FLOSS Project Planets

anubis @ Savannah: GNU anubis version 4.3

GNU Planet! - Sat, 2024-01-06 06:42

GNU anubis version 4.3 is available for download. This is a maintenance release, including some new features:

anubisusr requires GnuTLS


New configuration statement: use-pam

 
Used in CONTROL section, this boolean statement enables or disables the use of the Pluggable Authentication Module interface for accounting and session management.

New configuration statement: identd-keyfile


Sets the name of the file with shared keys used for decrypting replies from the auth service.  It is used in traditional mode if anubis receives an encrypted response from the client's identd server (e.g. if they are running pidentd with encryption).
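
For illustration, a hypothetical rc-file fragment showing where these statements would live (the addresses, host names, and key file path are placeholders, and the surrounding statements are assumptions, not part of the release notes):

---BEGIN CONTROL---
bind 0.0.0.0:24
remote-mta mail.example.org:25
# enable PAM-based accounting and session management
use-pam yes
# shared keys for decrypting encrypted identd replies
identd-keyfile /etc/anubis/identd.keys
---END---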

Categories: FLOSS Project Planets

37C3 Impressions

Planet KDE - Sat, 2024-01-06 06:15

A week has passed since I attended the 37th Chaos Communication Congress (37C3) in Hamburg, Germany, together with a bunch of other KDE people.

KDE Assembly

For the first time KDE had a larger presence, with a number of people and our own assembly. We also ended up hosting the Linux on Mobile assembly as their originally intended space didn’t materialize. The extra space demand was compensated by assimilating neighbouring tables of a group that didn’t show up for the most part.

KDE assembly featuring the event-typical colorful LEDs and sticker piles (photo by Victoria).

Just the logistical convenience of having a meeting point and a place to store your stuff made this a big help, and it also let a number of people find us that we otherwise probably wouldn’t have met.

Talks

There was one KDE talk in the main program, Joseph’s Software Licensing For A Circular Economy covering some of the KDE Eco work. I’d estimate about 500 attendees, despite the “early” morning slot.

Beyond that I actually managed to attend very few talks, Breaking “DRM” in Polish trains being clearly one of my favorites.

Emergency and Weather Alerts

I got a chance to meet the author of FOSS Warn. FOSS Warn is a free alternative to the proprietary emergency and weather alert apps in Germany. That topic had originally motivated my work on UnifiedPush support for KDE, and an emergency and weather alert service that allows subscribing to areas of interest and receiving push notifications for new alerts, covering almost 100 countries.

The latter is a proof of concept demo at best though. The idea is to collaborate with the FOSS Warn team on a joint backend implementation, evolving this into something production-ready and usable for all of us.

While there’s still a couple of technicalities to resolve, after having met in person I’m very confident that this will all work out.

Open Transport Community

Other groups I’m involved with were present at 37C3 as well, like the Open Transport community, with the Bahnbubble Meetup being the event that brought everyone together.

Discussion topics included:

  • Identifying and getting access to missing data tables and documentations for DB and VDV ticket barcodes.
  • Helping others with implementing the ERA FCB ticket format.
  • Evolving the Transport API Repository.
  • International collaboration and networking between the Open Transport communities in Europe.
  • Integration between Itinerary and Träwelling.
  • Train station identifiers in Wikidata, including a recent property proposal for DB station numbers, as well as a yet to be written one for IFOPT identifiers.

Many more topics ended up being discussed in parallel, overall I’d say there’s more than enough content and interested people for its own dedicated event/conference on this. Until somebody organizes that there’s the open-transport Matrix channel and the Railways and Open Transport track at FOSDEM.

Indoor Localization and Routing

The OSM community was at 37C3 as well of course, and in that context I got probably one of the most unexpected contacts, by meeting one of the authors of simpleLoc. That’s an indoor navigation solution, and indoor navigation is one of the big open challenges for the indoor map view used in Itinerary.

Like other such solutions it’s unfortunately not fully available as Free Software, but the available information, components and published papers nevertheless turned out very valuable, in particular since this is using a localization approach that requires only commonly available smartphone sensors.

Much more immediately applicable were the hints on how they implemented indoor routing. Unlike for “outdoor” navigation, graph-based algorithms are usually not an option; we need something that works on polygons in order to use OSM data as input directly. Existing solutions I encountered for that in a mapping/geography context were all proprietary unfortunately, but there’s a free library that solves that problem for 3D game engines: Recast Navigation.

While I’m not entirely sure yet how to map all relevant details to that (e.g. directional paths for escalators, tactile paving guides, etc), the initial experiments look very promising.

Routing through a complex indoor corridor.

There’s obviously a ton of work left, as this essentially requires mapping all relevant bits of OSM indoor data to a 3D model, and that’s at the very limit of what can be extracted from the OSM data and data model in many places.

KDE Outreach

While people working on technology make up a significant part of the audience of 37C3, there’s many more people from other areas attending as well, so this also was an opportunity for a bit of user research, following the pattern of the kde.org/for pages.

KDE for activists

On the KDE for Activists page we focus a lot on tools for secure and self-hosted infrastructure. The need for that seems like a given or even a hard requirement for people in that field, not something that needs to be argued for (the event itself might provide a certain selection bias for that though).

What we however need to improve on is making this much more robust and easy to setup and manage for people that might not be familiar with all the technical details, jargon and abbreviations we expose them to. Even worse, that can pose the risk of making dangerous mistakes, up to causing people physical harm.

KDE for FOSS projects

Meeting other FOSS developers is also interesting, in particular those not well connected to our usual bubble. Those are often single-person projects, some of them even quite successful ones. Around those, KDE is the odd outlier, due to our size and choice of tools and infrastructure (i.e. not GitHub).

Topics that typically come up then are handling of finances, legal risks and liability, shortcomings of GitHub, moderation of communication channels, etc., all things that are far less painful from the perspective of someone under the umbrella of a big organization like KDE.

I think there’s some interesting discussions to be had on how widely we want to extend the KDE umbrella and how we want to promote that, what other umbrella organizations there are and whether there are still uncovered gaps between those, and how to manage the scope and governance of such umbrella organizations.

It’s probably also worth talking more about what we already have and do in that regard, it’s not even clear to everyone apparently that joining a larger organization is even an option.

KDE for public administration

We got questions asking for pointers to material and support for doing medium-sized Linux/Plasma deployments in public administration. This is unfortunately something we don’t currently have anywhere in a well-structured form, I think.

It would seem very useful to have, beyond a KDE focus even. There’s people on the inside fighting for this, and while the upcoming Windows 11 induced large-scale hardware obsolescence is working in our favor there, the increasingly pervasive use of MS Teams is making a migration to FOSS infrastructure and workspaces much harder.

Conclusion

I had very high expectations for this after the experience at 36C3, but by day 3 this had exceeded all of them. The extreme breadth of people there is just unmatched, coming from FOSS/Open Data/Open Hardware projects tiny to large, public administration and infrastructure, education/universities, funding organizations, politics and lobbying, civil/social initiatives, you name it.

And all of that in a fun atmosphere that never ceases to amaze. While walking down a corridor you might find yourself overtaken by a person driving a motorized couch, and if you have an urgent need for an electron microscope for whatever reason, someone over in the other hall brought one just in case. And all of that is just “normal”; I could fill this entire post with anecdotes like that.

Ideas and plans for 38C3 were already discussed :)

Categories: FLOSS Project Planets

CodersLegacy: Switching between Multiple Screens in Tkinter (dynamically)

Planet Python - Sat, 2024-01-06 05:22

The following article demonstrates a useful trick in Tkinter to create multiple “screens” (e.g. a login screen, register screen, main page screen) within a single window! Instead of creating a new window for each screen, we will use a single tkinter window, which swaps dynamically between multiple “screens” (represented by frames).

This is more efficient and faster than creating a new window every time.

Complete Code:

The core idea is simple, we have multiple classes, each of which represents a window. These classes inherit from the Frame class, essentially making them frames as well. To “swap” between screens, we destroy the existing frame (including all of its children objects) and then create the new frame to take its place.
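
The swap itself takes only a few lines; here is the pattern in isolation (the class and method names match the full listing below):

def on_login(self):
    # remove every widget inside the current frame...
    for widget in self.winfo_children():
        widget.destroy()
    # ...then remove the frame itself and build the next
    # screen inside the same Tk window
    self.destroy()
    LoginWindow(self.master)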

A YouTube video explaining this code step-by-step is included at the bottom of this article.

import tkinter as tk


def center_window(width, height):
    # Center the root window on the screen at the given size
    x = (root.winfo_screenwidth() // 2) - (width // 2)
    y = (root.winfo_screenheight() // 2) - (height // 2)
    root.geometry(f'{width}x{height}+{x}+{y}')


class WelcomeWindow(tk.Frame):
    # First screen: offers Login and Register buttons
    def __init__(self, master):
        super().__init__()
        self.master = master
        self.master.title("Welcome")
        center_window(200, 150)

        login_button = tk.Button(self, text="Login", width=10, command=self.on_login)
        login_button.pack(padx=20, pady=(20, 10))

        register_button = tk.Button(self, text="Register", width=10, command=self.on_register)
        register_button.pack(pady=10)

        self.pack()

    def on_login(self):
        # Destroy this frame and its children, then swap in the login screen
        for widget in self.winfo_children():
            widget.destroy()
        self.destroy()
        LoginWindow(self.master)

    def on_register(self):
        for widget in self.winfo_children():
            widget.destroy()
        self.destroy()
        RegisterWindow(self.master)


class LoginWindow(tk.Frame):
    def __init__(self, master):
        super().__init__()
        self.master = master
        self.master.title("Login")
        self.master.resizable(False, False)
        center_window(250, 150)

        tk.Label(self, text="Username:").grid(row=0, column=0)
        self.username_entry = tk.Entry(self)
        self.username_entry.grid(row=0, column=1, padx=10, pady=10)

        tk.Label(self, text="Password:").grid(row=1, column=0)
        self.password_entry = tk.Entry(self, show="*")
        self.password_entry.grid(row=1, column=1, padx=10, pady=10)

        submit_button = tk.Button(self, text="Submit", width=8, command=self.on_successful_login)
        submit_button.grid(row=2, column=1, sticky="e", padx=10, pady=(10, 0))

        back_button = tk.Button(self, text="Back", width=8, command=self.on_back)
        back_button.grid(row=2, column=0, sticky="w", padx=10, pady=(10, 0))

        self.pack()

    def on_back(self):
        for widget in self.winfo_children():
            widget.destroy()
        self.destroy()
        WelcomeWindow(self.master)

    def on_successful_login(self):
        for widget in self.winfo_children():
            widget.destroy()
        self.destroy()
        MainWindow(self.master)


class RegisterWindow(tk.Frame):
    def __init__(self, master):
        super().__init__()
        self.master = master
        self.master.title("Register")
        self.master.resizable(False, False)
        center_window(300, 250)

        tk.Label(self, text="First Name:").grid(row=0, column=0, sticky="w")
        self.first_name_entry = tk.Entry(self, width=26)
        self.first_name_entry.grid(row=0, column=1, padx=10, pady=10, sticky="e")

        tk.Label(self, text="Last Name:").grid(row=1, column=0, sticky="w")
        self.last_name_entry = tk.Entry(self, width=26)
        self.last_name_entry.grid(row=1, column=1, padx=10, pady=10, sticky="e")

        tk.Label(self, text="Password:").grid(row=2, column=0, sticky="w")
        self.password_entry = tk.Entry(self, show="*", width=26)
        self.password_entry.grid(row=2, column=1, padx=10, pady=10, sticky="e")

        tk.Label(self, text="Email:").grid(row=3, column=0, sticky="w")
        self.email_entry = tk.Entry(self, width=26)
        self.email_entry.grid(row=3, column=1, padx=10, pady=10, sticky="e")

        submit_button = tk.Button(self, text="Submit", width=8)
        submit_button.grid(row=7, column=1, padx=10, pady=10, sticky="e")

        back_button = tk.Button(self, text="Back", width=8, command=self.on_back)
        back_button.grid(row=7, column=0, sticky="w", padx=10, pady=(10, 10))

        self.pack()

    def on_back(self):
        for widget in self.winfo_children():
            widget.destroy()
        self.destroy()
        WelcomeWindow(self.master)


class MainWindow(tk.Frame):
    def __init__(self, master):
        super().__init__()
        self.master = master
        center_window(500, 500)
        self.pack()


root = tk.Tk()
root.eval('tk::PlaceWindow . center')
WelcomeWindow(root)
root.mainloop()

This marks the end of the Switching between multiple Screens in Tkinter Tutorial. Any suggestions or contributions for CodersLegacy are more than welcome. Questions regarding the tutorial content can be asked in the comments section below.

The post Switching between Multiple Screens in Tkinter (dynamically) appeared first on CodersLegacy.

Categories: FLOSS Project Planets

eiriksm.dev: Could not open connection: unknown error: cannot find Chrome binary

Planet Drupal - Sat, 2024-01-06 01:49

I was just recently starting to get this on projects after (at least I think that's why) I updated to Chrome 120.

Could not open connection: unknown error: cannot find Chrome binary (Driver info: chromedriver=120.0.6099.71 (9729082fe6174c0a371fc66501f5efc5d69d3d2b-refs/branch-heads/6099_56@{#13}),platform=Linux 6.2.0-37-generic x86_64) (Behat\Mink\Exception\DriverException)

That's the error message in behat at least, which I think originates from the actual webdriver (chromedriver) response. If I look at the debug information from the chromedriver logs it says this:

[1703837483,910][INFO]: [0ca18bb59db30d5acd358de02a01da0a] RESPONSE InitSession ERROR unknown error: cannot find Chrome binary

Well, the error is clear enough: it cannot find the binary. That's fine by me, but where would I go to tell it about the binary? Well, for me that would be in behat.yml:

@@ -33,8 +33,9 @@ default:
       w3c: false
       marionette: null
       chrome:
+        binary: /usr/bin/google-chrome
         switches:

This probably translates to something like this, while initiating the session with chromedriver (slightly edited for relevance and brevity):

[1703887172,675][INFO]: [95b2908582293fa560a7301661f5e741] COMMAND InitSession {
  "desiredCapabilities": {
    "chrome.binary": "/usr/bin/google-chrome",
    "chrome.extensions": [],
    "chrome.switches": [
      "--ignore-certificate-errors",
      "--disable-gpu",
      "--no-sandbox",
      "--disable-dev-shm-usage"
    ],
    "goog:chromeOptions": {
      "args": [
        "--ignore-certificate-errors",
        "--disable-gpu",
        "--no-sandbox",
        "--disable-dev-shm-usage"
      ],
      "binary": "/usr/bin/google-chrome",
      "extensions": []
    },
    "ignoreZoomSetting": false,
    "marionette": true
  }
}

If you are using some other tool that interacts with chromedriver, I am sure you are already setting some parameters there, which you could append this new parameter to.

Categories: FLOSS Project Planets

Matt Layman: Fun With Scrapy Link Validation on CI

Planet Python - Fri, 2024-01-05 19:00
Here’s my scenario: I have a static site generator that is building HTML pages for a community project that I’m working on. How can I make sure, automatically, that all the links to other internal pages within the site continue to work? In this article, I’ll show you how I managed to do that using Scrapy, a web scraping tool, and GitHub Actions, the project’s Continuous Integration system. To solve this problem, I decided to use a web scraper.
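
As a rough sketch of that approach (not the article's actual code; the local URL, domain, and spider name are assumptions), a Scrapy spider that crawls every internal link of the generated site and logs the broken ones could look like this:

import scrapy
from scrapy.http import HtmlResponse
from scrapy.linkextractors import LinkExtractor

class InternalLinkSpider(scrapy.Spider):
    name = "internal-links"
    # assumption: the generated site is served locally during the CI run
    start_urls = ["http://localhost:8000/"]
    # let parse() see 404 responses instead of Scrapy dropping them
    handle_httpstatus_list = [404]

    def parse(self, response):
        if response.status == 404:
            # some page on the site links to this missing URL
            self.logger.error("Broken link: %s", response.url)
        elif isinstance(response, HtmlResponse):
            # follow internal links only; Scrapy deduplicates visited URLs
            for link in LinkExtractor(allow_domains=["localhost"]).extract_links(response):
                yield response.follow(link, callback=self.parse)

Running this with scrapy runspider in the CI job and failing the build if the log contains any "Broken link" line is one simple way to gate merges on link health.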
Categories: FLOSS Project Planets

Valhalla's Things: Blog updates

Planet Debian - Fri, 2024-01-05 19:00
Posted on January 6, 2024
Tags: madeof:bits, meta

After just a tiny[1] delay I’ve finally added support for tags to this blog.

In the next few days I may go back and add / change tags to the older posts, or I may not, I’ll decide.

Also, I still need to render a tag cloud somewhere; maybe it will happen soon, maybe it will take another year. :D

I hope I’ve also successfully worked around the bug in Friendica where the src for images got deleted instead of properly converted.

  1. less than one year!

Categories: FLOSS Project Planets

FSF Blogs: FSD meeting recap 2024-01-05

GNU Planet! - Fri, 2024-01-05 16:34
Check out the important work our volunteers accomplished at today's Free Software Directory (FSD) IRC meeting.
Categories: FLOSS Project Planets

Four Kitchens: Responsive image best practices for Drupal

Planet Drupal - Fri, 2024-01-05 16:32

Amanda Luker

Senior Frontend Engineer

As a Senior Frontend Engineer at Four Kitchens, Amanda is responsible for translating visual designs and applying them to Drupal sites.

Drupal provides a system for site administrators to add their own images and have them appear uniformly on the website. This system is called Image Styles. This tool can resize, crop, and scale images to fit any aspect ratio required by a design.

When creating responsive websites, a single image style for each image variation is insufficient. Each image, such as a hero image, a card image, a WYSIWYG image, or a banner image, requires multiple versions of one image. This ensures that the website delivers only what visitors need based on their screen size. For instance, a mobile user may only require a 320-pixel-wide image, while a large desktop user may want an 1,800-pixel-wide image (doubled for double-pixel density). For this reason, Drupal has Responsive Image Styles, which will group your images into a set of styles that will each show under different conditions.

Practical approach to convert images from design to Drupal
  • Determine your image’s aspect ratio. If you find that the images in the design are not in a common aspect ratio (like 1:1, 2:1, 4:3, or 16:9) or if they vary by a little bit, consider running the dimensions through a tool that will find the closest reasonable aspect ratio.
  • Determine the smallest and largest image sizes. For example, for a 16:9 aspect ratio, the smallest size might be 320 pixels x 180 pixels, while the largest could be 3,200 pixels x 1,800 pixels (doubled for high-density screens).
  • To generate all variations, you can use an AI tool to print the image dimensions with 160-pixel increments between each size; 160-pixel increments tend to hit a lot of common breakpoints. Here’s an example using GitHub Copilot (recreated in the sketch below):
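
The original post shows the Copilot output as a screenshot; a minimal Python sketch generates the same kind of list, assuming the 16:9 ratio and the 320 to 3,200 pixel bounds from the earlier example:

# Print 16:9 image-style dimensions in 160-pixel width increments.
# The ratio and bounds are the article's example values, not fixed rules.
for width in range(320, 3201, 160):
    height = round(width * 9 / 16)
    print(f"{width} x {height}")

The first line printed is 320 x 180 and the last is 3200 x 1800, matching the smallest and largest sizes above.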

There are likely more ways to streamline this process with Copilot. I’ve also used ChatGPT to rewrite the list with a prefix on each style name, making the styles easy to add in Drupal.

If adding all of these steps seems like a lot of work, consider using the Easy Responsive Images module! This module can create image styles for you, allowing you to set your aspect ratios and the increment between each style.

Once you have all your styles in place, create your responsive image styles by following these steps:

  • Choose a name for your responsive image style based on its usage
  • Select the “responsive image” breakpoint group
  • Usually, I choose to select multiple image styles and use the sizes attribute to craft the size rules. For example:

(min-width:1200px) 30vw, (min-width:960px) 50vw, 100vw

In this example, below 960 pixels the browser will choose the image that best fits the full width of the viewport. From 960 pixels up it will choose the image that best fills half of the viewport width, and from 1,200 pixels up, 30% of it. (Note that the browser uses the first media condition that matches, so the conditions are listed from largest to smallest.) This approach is nimble and allows the browser to choose the most appropriate image for each case.

After setting the size rules, choose all of the image styles that you want the browser to be able to use. You don’t have to use them all. In some cases, you might have two responsive image styles that are pulling from the same aspect ratio image styles, but one uses all of them and the other uses a subset of them.

After adding your responsive image style, you need to map your Media View Mode:

  1. Go to https://[your-site.local]/admin/structure/display-modes/view/add/media
  2. Add the media view mode as a new Display for Images: https://[your-site.local]/admin/structure/media/manage/image/display
  3. Choose “Responsive image” as the Format and select your new responsive image style

Once you have set this up, you are ready to use the View Mode to display the image field for your entity.

In this example, all the images have the same breakpoint. There may be times when you need different aspect ratios at different breakpoints. In those cases, you may want to use your custom theme’s Breakpoint Group. This will allow you to manually select each image style for each breakpoint (instead of letting Drupal choose it for you).

The post Responsive image best practices for Drupal appeared first on Four Kitchens.

Categories: FLOSS Project Planets

Andy Wingo: scheme modules vs whole-program compilation: fight

GNU Planet! - Fri, 2024-01-05 15:43

In a recent dispatch, I explained the whole-program compilation strategy used in Whiffle and Hoot. Today’s note explores what a correct solution might look like.

being explicit

Consider a module that exports an increment-this-integer procedure. We’ll use syntax from the R6RS standard:

(library (inc)
  (export inc)
  (import (rnrs))
  (define (inc n)
    (+ n 1)))

If we then have a program:

(import (rnrs) (inc))
(inc 42)

Then the meaning of this program is clear: it reduces to (+ 42 1), then to 43. Fine enough. But how do we get there? How does the compiler compose the program with the modules that it uses (transitively), to produce a single output?

In Whiffle (and Hoot), the answer is, sloppily. There is a standard prelude that initially has a number of bindings from the host compiler, Guile. One of these is +, exposed under the name %+, where the % in this case is just a warning to the reader that this is a weird primitive binding. Using this primitive, the prelude defines a wrapper:

... (define (+ x y) (%+ x y)) ...

At compilation-time, Guile’s compiler recognizes %+ as special, and therefore compiles the body of + as consisting of a primitive call (primcall), in this case to the addition primitive. The Whiffle (and Hoot, and native Guile) back-ends then avoid referencing an imported binding when compiling %+, and instead produce backend-specific code: %+ disappears. Most uses of the + wrapper get inlined so %+ ends up generating code all over the program.

The prelude is lexically splatted into the compilation unit via a pre-expansion phase, so you end up with something like:

(let () ; establish lexical binding contour
  ...
  (define (+ x y) (%+ x y))
  ...
  (let () ; new nested contour
    (define (inc n) (+ n 1))
    (inc 42)))

This program will probably optimize (via partial evaluation) to just 43. (What about let and define? Well. Perhaps we’ll get to that.)

But, again here I have taken a short-cut, which is about modules. Hoot and Whiffle don’t really do modules, yet anyway. I keep telling Spritely colleagues that it’s complicated, and rightfully they keep asking why, so this article gets into it.

is it really a big letrec?

Firstly you have to ask, what is the compilation unit anyway? I mean, given a set of modules A, B, C and so on, you could choose to compile them separately, relying on the dynamic linker to compose them at run-time, or all together, letting the compiler gnaw on them all at once. Or, just A and B, and so on. One good-enough answer to this problem is library-group form, which explicitly defines a set of topologically-sorted modules that should be compiled together. In our case, to treat the (inc) module together with our example program as one compilation unit, we would have:

(library-group
  ;; start with sequence of libraries
  ;; to include in compilation unit...
  (library (inc) ...)
  ;; then the tail is the program that
  ;; might use the libraries
  (import (rnrs) (inc))
  (inc 42))

In this example, the (rnrs) base library is not part of the compilation unit. Presumably it will be linked in, either as a build step or dynamically at run-time. For Hoot we would want the whole prelude to be included, because we don’t want any run-time dependencies. Anyway hopefully this would expand out to something like the set of nested define forms inside nested let lexical contours.

And that was my instinct: somehow we are going to smash all these modules together into a big nested letrec, and the compiler will go to town. And this would work, for a “normal” programming language.

But with Scheme, there is a problem: macros. Scheme is a “programmable programming language” that allows users to extend its syntax as well as its semantics. R6RS defines a procedural syntax transformer (“macro”) facility, in which the user can define functions that run on code at compile-time (specifically, during syntax expansion). Scheme macros manage to compose lexical scope from the macro definition with the scope at the macro instantiation site, by annotating these expressions with source location and scope information, and making syntax transformers mostly preserve those annotations.

“Macros are great!”, you say: well yes, of course. But they are a problem too. Consider this incomplete library:

(library (ctinc)
  (import (rnrs) (inc))
  (export ctinc)
  (define-syntax ctinc
    (lambda (stx)
      ...))  ;; ***

The idea is to define a version of inc, but at compile-time: a (ctinc 42) form should expand directly to 43, not a call to inc (or even +, or %+). We define syntax transformers with define-syntax instead of define. The right-hand-side of the definition ((lambda (stx) ...)) should be a procedure of one argument, which returns one value: so far so good. Or is it? How do we actually evaluate what (lambda (stx) ...) means? What should we fill in for ...? When evaluating the transformer value, what definitions are in scope? What does lambda even mean in this context?

Well... here we butt up against the phasing wars of the mid-2000s. R6RS defines a whole system to explicitly declare what bindings are available when, then carves out a huge exception to allow for so-called implicit phasing, in which the compiler figures it out on its own. In this example we imported (rnrs) for the default phase, and this is the module that defines lambda (and indeed define and define-syntax). The standard defines that (rnrs) makes its bindings available both at run-time and expansion-time (compilation-time), so lambda means what we expect that it does. Whew! Let’s just assume implicit phasing, going forward.

The operand to the syntax transformer is a syntax object: an expression annotated with source and scope information. To pick it apart, R6RS defines a pattern-matching helper, syntax-case. In our case ctinc is unary, so we can begin to flesh out the syntax transformer:

(library (ctinc)
  (import (rnrs) (inc))
  (export ctinc)
  (define-syntax ctinc
    (lambda (stx)
      (syntax-case stx ()
        ((ctinc n)
         (inc n))))))  ;; ***

But here there’s a detail, which is that when syntax-case destructures stx to its parts, those parts themselves are syntax objects which carry the scope and source location annotations. To strip those annotations, we call the syntax->datum procedure, exported by (rnrs).

(library (ctinc)
  (import (rnrs) (inc))
  (export ctinc)
  (define-syntax ctinc
    (lambda (stx)
      (syntax-case stx ()
        ((ctinc n)
         (inc (syntax->datum #'n)))))))

And with this, voilà our program:

(library-group
  (library (inc) ...)
  (library (ctinc) ...)
  (import (rnrs) (ctinc))
  (ctinc 42))

This program should pre-expand to something like:

(let ()
  (define (inc n) (+ n 1))
  (let ()
    (define-syntax ctinc
      (lambda (stx)
        (syntax-case stx ()
          ((ctinc n)
           (inc (syntax->datum #'n))))))
    (ctinc 42)))

And then expansion should transform (ctinc 42) to 43. However, our naïve pre-expansion is not good enough for this to be possible. If you ran this in Guile you would get an error:

Syntax error: unknown file:8:12: reference to identifier outside its scope in form inc

Which is to say, inc is not available as a value within the definition of ctinc. ctinc could residualize an expression that refers to inc, but it can’t use it to produce the output.

modules are not expressible with local lexical binding

This brings us to the heart of the issue: with procedural macros, modules impose a phasing discipline on the expansion process. Definitions from any given module must be available both at expand-time and at run-time. In our example, ctinc needs inc at expand-time, which is an early part of the compiler that is unrelated to any later partial evaluation by the optimizer. We can’t make inc available at expand-time just using let / letrec bindings.

This is an annoying result! What do other languages do? Well, mostly they aren’t programmable, in the sense that they don’t have macros. There are some ways to get programmability using e.g. eval in JavaScript, but these systems are not very amenable to “offline” analysis of the kind needed by an ahead-of-time compiler.

For those declarative languages with macros, Scheme included, I understand the state of the art is to expand module-by-module and then stitch together the results of expansion later, using a kind of link-time optimization. You visit a module’s definitions twice: once to evaluate them while expanding, resulting in live definitions that can be used by further syntax expanders, and once to residualize an abstract syntax tree, which will eventually be spliced into the compilation unit.

Note that in general the expansion-time and the residual definitions don’t need to be the same, and indeed during cross-compilation they are often different. If you are compiling with Guile as host and Hoot as target, you might implement cons one way in Guile and another way in Hoot, choosing between them with cond-expand.

lexical scope regained?

What is to be done? Glad you asked, Vladimir. But, I don’t really know. The compiler wants a big blob of letrec, but the expander wants a pearl-string of modules. Perhaps we try to satisfy them both? The library-group paper suggests that modules should be expanded one by one, then stitched into a letrec by AST transformations. It’s not that lexical scope is incompatible with modules and whole-program compilation; the problems arise when you add in macros. So by expanding first, in units of modules, we reduce high-level Scheme to a lower-level language without syntax transformers, but still on the level of letrec.

I was unreasonably pleased by the effectiveness of the “just splat in a prelude” approach, and I will miss it. I even pled for a kind of stop-gap fat-fingered solution to sloppily parse module forms and keep on splatting things together, but colleagues helpfully talked me away from the edge. So good-bye, sloppy: I repent my ways and will make amends, with 40 hail-maries and an alpha renaming thrice daily and more often if in moral distress. Further bulletins as events warrant. Until then, happy scheming!

Categories: FLOSS Project Planets

The Drop Times: Think Licensing to Ensure Responsible Use of AI Tools, Says Lyubomir Filipov

Planet Drupal - Fri, 2024-01-05 15:19
Explore the evolution of Lyubomir Filipov's career from his initial encounter with Drupal to his current role as a Group Architect at FFW Agency, delving into his strategies for team management, overcoming multilingual challenges, community involvement, and the intersection of artificial intelligence with the digital field. Filipov's diverse experiences reflect a multifaceted professional journey, offering insights into Drupal, community engagement, academic pursuits, and the future landscape of digital expertise.
Categories: FLOSS Project Planets

MidCamp - Midwest Drupal Camp: Get your session in now...

Planet Drupal - Fri, 2024-01-05 14:29
Get your session in now...

 

Don't Miss the Last Chance to Submit Your Idea

With over 40 sessions submitted to date, we have a couple of days left for you to get your ideas in for MidCamp 2024!

We’re open to talks for all levels of Drupal users, from beginner through advanced, as well as end users and business owners.

See our session tracks page for a full description of the kinds of talks we are looking for.  If you’re new to speaking, check out the recording of our Speaker Workshop for ideas.

Submit a session now!

Important Dates:

  • Proposal Deadline: January 7 (Sunday), 2024 at midnight CST
  • Tickets on sale: very soon!
  • Early-bird deadline and speakers announced:  February (all speakers are eligible for a free ticket, and anyone who submits a session that is not accepted will be eligible for early-bird pricing even after it closes)
  • MidCamp 2024: March 20-22

Sponsors Get Early, Privileged Access

Get early and privileged access to new talent and customers by sponsoring MidCamp. We have a variety of sponsorship packages available.

Starting at $600, sponsoring organizations can target their jobs to a select group of experienced Drupal talent, maximize exposure by sharing space with dozens of jobs instead of millions, and have three days of being face-to-face with applicants.

Our sponsorship packages are designed to showcase your organization as a supporter of the Drupal community and provide opportunities to:

  • grow your brand,
  • generate leads,
  • and recruit Drupal talent.

Check out the sponsorship packages here, we look forward to working with you to get your organization involved for 2024!

Stay In The Loop

Join the MidCamp Slack and come hang out with the community online. We will be making announcements there from time to time. We’re also on Twitter and Mastodon.

We can’t wait to see you soon! Don’t forget, cancel all those other plans and make MidCamp the only thing happening on your calendar from March 20-22, 2024.

Categories: FLOSS Project Planets

DinoTechno.com: Unlock the Power of Drupal 10: Discover its Key Features and Benefits

Planet Drupal - Fri, 2024-01-05 13:08
Drupal is an open-source content management system (CMS) that has gained significant popularity for its ability to power some of the most prominent and frequently visited websites globally. With a vast developer community 1.4 million strong, contributing over 50,000 free modules, Drupal offers unparalleled flexibility and power, making it our preferred CMS choice for […]
Categories: FLOSS Project Planets
