Feeds

Seth Michael Larson: AI and Wonder

Planet Python - Tue, 2023-12-26 19:00

Published 2023-12-27 by Seth Larson

On Christmas Day, Brandt Bucher, a CPython core developer, created a wonderful thing. Brandt wrote a pull request description for the new copy-and-patch JIT compiler for CPython in the style of the poem “A Visit from St. Nicholas” that describes the changes being made (with references!) and rhymes all the way through. Truly a work of art!

This work undoubtedly took a while to complete and was a source of joy for many people who read the whole thing. This work of art received hundreds of reactions on GitHub, hit #1 on Hacker News with 260+ points, and got plenty of love when I shared the URL on Mastodon.


Personal opinions inbound!

Then I saw a comment that implied being able to have this sort of poem on every pull request might be desirable, or “even more amazing”, and could be done using AI. To experiment with what that might feel like, I fired up everyone's favorite GPT and provided the following prompt as input:

Create a rhyming poem in the style of "A Visit from St Nicholas" about adding a Just-in-Time compiler to CPython

Sure enough, in a few moments I received a satisfactory answer including no less than two “jubilant cheers” from Guido. I felt no sense of wonder in reading the generated poem, but with some editing it's unclear whether I would know that a single instance wasn't created using a tool instead of by hand. If there were thousands of these works being created within seconds, though, it would not be hard to figure out that AI was involved.

Wonder or awe is a deeply personal feeling and people experience it differently. It's a feeling that appears to have positive effects on humans, so I am dismayed by the diminished sense of wonder caused by the direction that some digital art is taking. There are knock-on effects too: attempts to blend AI works in amongst human works make it more difficult than ever to believe or appreciate what you see on the internet at first glance.

This is only my own opinion and there will likely be varied thoughts on this phenomenon. Regardless, I'll continue to support the artists that I enjoy the most and to highlight wondrous works created by humans.

Thanks for reading! ♡ Did you find this article helpful and want more content like it? Get notified of new posts by subscribing to the RSS feed or the email newsletter.

This work is licensed under CC BY-SA 4.0

Categories: FLOSS Project Planets

eiriksm.dev: Social historical code archeology: cronner.module

Planet Drupal - Tue, 2023-12-26 17:01

Violinist.io is well into its seventh year. It has survived a couple of Drupal core major upgrades and remains a profitable SaaS based on Drupal and a vital tool for agencies and organizations. I'm not sure if there is such a thing as social historical code archeology, but if there were, then this blog post might be it. I have long wanted to write some blog posts about the development of violinist.io that do not only focus on exciting or interesting technical aspects. Instead, I think this will highlight when things did not go as planned, why that was, and maybe something about what it meant at the time compared to the present day.

The project started as a PHP file and proof of concept in Drupal Developer Days Seville 2017. Literally. It was one PHP script placed inside one file. After I got that working I figured I would make it a composer package and run it with Drupal cron. This was the first version of violinist.io. Run the method of a composer package with cron. I made a module to contain this hook_cron implementation. I called it cronner. It was the module to do things on cron so that was the name. The cronner module.

cronner.module was first checked in to git with the revision 55da4a9c:

From 55da4a9c
Date: Thu, 23 Mar 2017 08:38:57 +0100
Subject: [PATCH] This is a site now?

 

As we may deduce from the commit message, this was a pretty early one in the codebase. More precisely, this was the second commit to the codebase, after some initial scaffolding files. At that point, this was the entire contents of cronner.module:

<?php

/**
 * Implements hook_cron().
 */
function cronner_cron() {
  $nodes = \Drupal::entityTypeManager()
    ->getStorage('node')
    ->loadByProperties([
      'type' => 'project',
    ]);
  $q = \Drupal::queue('cronner_project');
  foreach ($nodes as $node) {
    $q->createItem($node);
  }
}

Basically a cron implementation to queue all projects on the site periodically.

Fast forward to 2023 and some reflections around maintaining a codebase. Cronner. I can honestly say that cronner is among my least proud namings in the codebase. The hook_cron implementation is long gone and replaced with all sorts of fancy cloud computing. The name really is misleading now. It contains no cron implementations like that any more. Instead it contains (among other things):

  • Schema for the database table called cronner_log
  • Some random theme hooks used here and there
  • A hook_node_insert implementation that is still the one responsible for adding new projects to the update queue

I have many times wanted to work on extracting parts of cronner.module into separate modules. But fixing bugs, improving test coverage or implementing new features has always been the priority. At this point, from time to time I open the file cronner.module and shudder. But after that initial feeling of annoyance or shame I instead focus on the image of something like the origin life form. Something like a web of branches and roots coming out of a mother plant. Something almost alien-like. The module that started it all, the breathing and growing mother of the codebase.

Some might call it legacy. I call it cronner, the origin mother of the codebase.

Categories: FLOSS Project Planets

Video Recommendation: Coping with Other People's Code - Laura Savino

Planet KDE - Tue, 2023-12-26 16:06

As someone suffering from a latent burnout thingy which has become more imminent in recent years, and as someone who is still struggling to develop strategies to alleviate its effects on health and general well-being, I wholeheartedly recommend that everyone watch this video and let those points sink in. Yes, even if you are not (yet) affected. The video is not all about burnout but about strategies for sustaining long-term sanity.

https://www.youtube.com/watch?v=qyz6sOVON68

Enjoy. :)

Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #609 (Dec. 26, 2023)

Planet Python - Tue, 2023-12-26 14:30

#609 – DECEMBER 26, 2023
View in Browser »

A lot has happened in the Python ecosystem in 2023, and with our final issue of the year we’re recapping the most popular links in PyCoder’s this year. That’s right: the most-clicked articles and discussions, and the five most-clicked projects as well.

Here’s to you, dear reader. Thanks for continuing to be with us at PyCoder’s Weekly. I’m sure 2024 will be just as interesting as 2023. And if in 2024 you come across a cool article or a project you think deserves notice, send it to us.

Happy Pythoning!

— The PyCoder’s Weekly Team
    Christopher Trudeau, Curator
    Dan Bader, Editor

Design and Guidance: Object-Oriented Programming in Python

In this video course, you’ll learn about the SOLID principles, which are five well-established standards for improving your object-oriented design in Python. By applying these principles, you can create object-oriented code that is more maintainable, extensible, scalable, and testable.
REAL PYTHON course

Python 3.12 Preview: More Intuitive and Consistent F-Strings

In this tutorial, you’ll preview one of the upcoming features of Python 3.12, which introduces a new f-string syntax formalization and implementation. The new implementation lifts some restrictions and limitations that affect f-string literals in Python versions lower than 3.12.
REAL PYTHON
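As a rough illustration of the kind of restriction that gets lifted (these examples are mine, not the tutorial's, and they only parse on Python 3.12+):

names = ["Ada", "Grace", "Margaret"]

# Reusing the outer quote character inside the replacement field is now allowed.
print(f"first: {names[0]}, joined: {", ".join(names)}")

# Backslashes inside the expression part are also allowed in 3.12.
print(f"one per line:\n{"\n".join(names)}")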

Learning About Code Metrics in Python With Radon

Radon is a code metrics tool. This article introduces you to it and teaches you how you can improve your code based on its measurements.
MIKE DRISCOLL

Speeding Up Your Code When Multiple Cores Aren’t an Option

Parallelism isn’t the only answer: often you can optimize low-level code to get significant performance improvements.
ITAMAR TURNER-TRAURING

Discussions

“Why Python Is Terrible”

HACKER NEWS

Kill a Developer in 4 Words or Less 😂

Some favourites: “Let’s deploy this Friday!”, “Works on my machine!”, “You’ve got merge conflicts”
TWITTER

Python Jobs

Senior Python Architect and Tech Lead (America)

Six Feet Up

Python Tutorial Editor (Anywhere)

Real Python

Software Engineer - Intern (Summer 2024) (Dallas, TX, USA)

Causeway Capital Management

More Python Jobs >>>

Articles & Tutorials How to Catch Multiple Exceptions in Python

In this how-to tutorial, you’ll learn different ways of catching multiple Python exceptions. You’ll review the standard way of using a tuple in the except clause, but also expand your knowledge by exploring some other techniques, such as suppressing exceptions and using exception groups.
REAL PYTHON
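For a quick taste of the techniques the tutorial mentions (an illustrative sketch, not code taken from the article), a tuple in the except clause catches several exception types at once, and contextlib.suppress() ignores them entirely:

import contextlib

def read_config(path):
    try:
        with open(path) as handle:
            return handle.read()
    except (FileNotFoundError, PermissionError) as exc:
        # A tuple in the except clause handles several exception types in one branch.
        print(f"could not read {path}: {exc}")
        return ""

# contextlib.suppress() silently ignores the listed exception types instead.
with contextlib.suppress(FileNotFoundError, PermissionError):
    print(open("settings.ini").read())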

78% MNIST Accuracy Using GZIP in Under 10 Lines of Code

MNIST is a collection of hand-written digits that is commonly used to play with classification algorithms. It turns out that some compression mechanisms can double as classification tools. This article covers a bit of the why, with the added code-golf goal of using a small amount of code.
JAKOBS.DEV
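The article works on MNIST images; purely to illustrate the underlying idea (compression distance plus a nearest-neighbour vote), here is a generic sketch with made-up toy data, not the article's own code:

import gzip

def ncd(a: bytes, b: bytes) -> float:
    # Normalized compression distance: smaller when a and b compress well together.
    ca = len(gzip.compress(a))
    cb = len(gzip.compress(b))
    cab = len(gzip.compress(a + b))
    return (cab - min(ca, cb)) / max(ca, cb)

def classify(sample: bytes, train: list[tuple[bytes, int]], k: int = 3) -> int:
    # k-nearest-neighbour vote using the compression distance as the metric.
    nearest = sorted(train, key=lambda item: ncd(sample, item[0]))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

train = [
    (b"aaaaaaaaaaaaaaaaaaaa", 0),
    (b"aaaaaaaaaaaaaaaaaaab", 0),
    (b"zzzzzzzzzzzzzzzzzzzz", 1),
    (b"zzzzzzzzzzzzzzzzzzzy", 1),
]
print(classify(b"aaaaaaaaaaaaabaaaaaa", train))  # expected: 0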

ChatGPT: Your Personal Python Coding Mentor

Large language models have gained popularity since OpenAI released ChatGPT. In this tutorial, you’ll learn how to use ChatGPT as your Python coding mentor. You’ll study a variety of use cases, learn how to interpret results, and learn to beware of incorrect and irrelevant responses.
REAL PYTHON

SOLID Principles: Improve Object-Oriented Design in Python

In this tutorial, you’ll learn about the SOLID principles, which are five well-established standards for improving your object-oriented design in Python. By applying these principles, you can create object-oriented code that is more maintainable, extensible, scalable, and testable.
REAL PYTHON

Mojo, a Superset of Python

Mojo is a new programming language, which is a superset of Python. It aims to fix Python’s performance and deployment problems.
JEREMY HOWARD

6 Cool Things You Can Do With the functools Module

The functools module in the standard library has all sorts of useful bits and pieces. This article talks about six of them: caching, writing fewer dunder methods, freeze functions, generic functions, better decorators, and reduce().
BOB BELDERBOS
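To give a flavour of a few of those pieces (an illustrative sketch, not the article's own examples):

from functools import cache, reduce, total_ordering

@cache
def fib(n: int) -> int:
    # cache memoizes results, so the naive recursion stays fast.
    return n if n < 2 else fib(n - 1) + fib(n - 2)

@total_ordering
class Version:
    # total_ordering derives the remaining comparison dunders from __eq__ and __lt__.
    def __init__(self, major: int, minor: int):
        self.major, self.minor = major, minor
    def __eq__(self, other):
        return (self.major, self.minor) == (other.major, other.minor)
    def __lt__(self, other):
        return (self.major, self.minor) < (other.major, other.minor)

# reduce folds an iterable down to a single value.
total = reduce(lambda acc, n: acc + n, range(1, 11))
print(fib(50), Version(1, 2) <= Version(1, 10), total)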

Discover bpython: A Python REPL With IDE-Like Features

In this tutorial, you’ll learn about bpython, an alternative Python REPL that brings code suggestions and many other IDE-like features to the terminal. Once you discover how much bpython can improve your productivity, you’ll never want to return to using the vanilla Python REPL again.
REAL PYTHON

Python Classes: The Power of Object-Oriented Programming

In this tutorial, you’ll learn how to create and use full-featured classes in your Python code. Classes provide a great way to solve complex programming problems by approaching them through models that represent real-world objects.
REAL PYTHON

How to Annotate Methods That Return self

In this tutorial, you’ll learn how to use the Self type hint in Python to annotate methods that return an instance of their own class. You’ll gain hands-on experience with type hints and annotations of methods that return an instance of their class, making your code more readable and maintainable.
REAL PYTHON
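As a small illustration of the pattern (assuming Python 3.11+, where Self lives in typing; this is not the tutorial's own code):

from typing import Self  # on Python < 3.11, typing_extensions provides Self

class QueryBuilder:
    def __init__(self) -> None:
        self.clauses: list[str] = []

    def where(self, clause: str) -> Self:
        # Annotating the return type as Self keeps chained calls correctly
        # typed even for subclasses of QueryBuilder.
        self.clauses.append(clause)
        return self

query = QueryBuilder().where("age > 21").where("country = 'ES'")
print(query.clauses)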

Boost Your Coding Productivity With Ptpython

Learn how to enhance your Python development workflow with auto-completion, syntax highlighting, history navigation, and more. In this tutorial, you’ll walk through the fundamentals of ptpython, covering installation, basic usage, and advanced features.
REAL PYTHON

AsyncIO: Why I Hate It

Charles is the creator of Peewee ORM and often gets the question “when will it support asyncio?” In this opinion piece he talks about why he doesn’t like asyncio and the alternatives he prefers.
CHARLES LEIFER opinion

Projects & Code nicegui: Create Web-Based UI With Python

GITHUB.COM/ZAUBERZEUG

Comprehensive Python Cheatsheet

GTO76.GITHUB.IO

csvkit: A Suite of CSV Utilities

GITHUB.COM/WIRESERVICE

pynimate: Python Package for Statistical Data Animations

GITHUB.COM/JULKAAR9

Cross Platform GUI Framework Based on HTML/CSS

GITHUB.COM/SCRIPTIOT • Shared by dragondjf

Happy Pythoning!
This was PyCoder’s Weekly Issue #609.
View in Browser »

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

TechBeamers Python: How to Get the Current Timestamp as a String in Python

Planet Python - Tue, 2023-12-26 12:49

Dealing with timestamps is common in programming, especially when you need to record events or log information. In Python, there are various ways to get the current timestamp as a string. In this simple guide, we’ll explore different methods with examples that anyone can understand. 7 Ways to Get the Current Timestamp in Python Before […]
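As a quick taste of a few of the usual approaches (illustrative only, not the guide's exact code):

import time
from datetime import datetime, timezone

# Local time, formatted explicitly with strftime().
print(datetime.now().strftime("%Y-%m-%d %H:%M:%S"))

# ISO 8601 string in UTC.
print(datetime.now(timezone.utc).isoformat())

# Seconds since the epoch, converted to a string.
print(str(time.time()))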

The post How to Get the Current Timestamp as a String in Python appeared first on TechBeamers.

Categories: FLOSS Project Planets

Redfin Solutions: Event Organizers Working Group (EOWG)

Planet Drupal - Tue, 2023-12-26 10:31
Leslie Glynn, Manager of Customer Success and Project Browser Initiative lead, talks about the Drupal EOWG.
Categories: FLOSS Project Planets

The Drop Times: Moments to Treasure

Planet Drupal - Tue, 2023-12-26 09:04

Dear Drupal Community,

As the holiday season embraces us, I extend warm Season's Greetings from The DropTimes Family! These cherished moments, surrounded by our loved ones, savoring comforting meals, and cherishing each instance together, signify what many consider the most beautiful time of the year.

Amidst this season of togetherness, let's not overlook the parallels between these heartwarming moments and the essence of Drupal. Just as we cherish our connections with family and friends, Drupal thrives on the power of connections—forging a community built on collaboration, shared passion, and innovation.

This extended holiday period offers us a chance to revel in the warmth of our relationships and celebrate the robust network of connections within the Drupal community. It’s a time to pause, reflect, and appreciate the bonds that enrich our personal and professional lives.

As we approach the culmination of the year, surrounded by laughter and love, let’s acknowledge the significance of these connections. Just as the Drupal community comes together to innovate and create, let us also embrace the spirit of unity and collaboration, nurturing relationships and supporting one another.

May these moments of togetherness and shared joy serve as a poignant reminder of the importance of connections—both personal and within our Drupal family—filling our hearts with gratitude, warmth, and a renewed sense of community.

Warm wishes for a joyful holiday season and a Happy New Year!

Last week, The DropTimes had a captivating interview with Daniel Angelov, exploring his remarkable journey in accessibility advocacy. Kazima Abbas skillfully conducted this insightful conversation. Meanwhile, Alka Elizabeth had an engaging email exchange with Jeff Eaton, delving into the Token Module's origin, progress, and prospects. I had the pleasure of capturing the festive Christmas celebrations held at the offices of Zoocha and SparkFabrik; you can check it out here. I also had the chance to delve deeper into the Drupal Project Browser Initiative, where Christopher Wells is playing a pivotal role. This initiative represents a significant step forward in enhancing Drupal's user experience and accessibility.

Exciting news from the Drupal 7 security initiative: Klaus Purer, Ivan Tibezh, Juraj Falat, and Andrii Cheredn have joined as new members! Explore their valuable contributions here. In recent updates, a moderately critical security risk has been identified in the data visualization framework module within Drupal. Dominique De Cooman, Dropsolid's Co-Founder and Co-CEO, achieves a remarkable milestone by securing a seat on the esteemed Mautic Community Council. Discover more about this remarkable achievement in the news article.
Introducing the Drupal Geysir Module! This latest release brings a collection of user interface improvements aimed at streamlining content authors' daily workflow. Dive into its transformative impact on Drupal's user experience right here. Guided by CEO Tim Doyle, the Drupal Association achieved significant milestones in 2023 through a visionary 3-year plan, fostering innovation, expanding outreach, and boosting support. Read more about it here.

Exciting news from DrupalCamp Belgium 2024. The call for speakers is now open until March 10, 2024. Don't miss this opportunity to share your insights and participate in this dynamic event. Join the exceptional Drupal Mountain Camp 2024 - organizers seek potential sponsors for this remarkable event. Click here for details on involvement. Dive into the dynamic world of design systems on January 17, 2024, at iO Campus Amsterdam during the 'Insightful Design Systems' meetup. Professionals across fields are invited to explore the intricacies of design systems in this engaging event.

At DrupalCon Portland 2024, Julia Kranzthor, Director of Philanthropy at the Drupal Association, unveils the return of the Nonprofit Summit, fostering collaboration and innovation for nonprofits leveraging Drupal.

 Fresh off the press: Adam Bergstein's latest release, "Drupal 10 Masterclass: A Comprehensive Guide for Beginners," is now available on Amazon. Tailored specifically for beginners, this book is a must-read for those diving into Drupal.

While more stories beckon for exploration, constraints compel us to halt further selection. To get timely updates, follow us on LinkedIn, Twitter and Facebook.

Thank you,

With warm wishes,
Elma John
Sub-Editor, TheDropTimes

Categories: FLOSS Project Planets

TechBeamers Python: How to Check If Python List is Empty

Planet Python - Tue, 2023-12-26 07:05

Checking whether a Python list is empty is a fundamental task in programming. An empty list often signals the absence of data or the need for specific handling. In this tutorial, we will explore various methods to check if a Python list is empty, covering different aspects and providing multiple examples. Understanding these techniques is […]
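For a flavour of the idioms such a tutorial usually compares (an illustrative sketch, not the article's code):

items = []

# Idiomatic: an empty list is falsy, so a plain truth test is enough.
if not items:
    print("the list is empty")

# More explicit alternatives some codebases prefer:
if len(items) == 0:
    print("still empty")
if items == []:
    print("compared against a literal empty list")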

The post How to Check If Python List is Empty appeared first on TechBeamers.

Categories: FLOSS Project Planets

Specbee: A Deep Dive into the Webform Module for Drupal 10

Planet Drupal - Tue, 2023-12-26 06:47
The Webform module is the most powerful and flexible form builder and submission manager for Drupal. It gives site builders the power to easily create complex forms instantly. It comes with a sensible set of default settings, while also letting you customize it as per your requirements. Check out this blog, Drupal 9 Webform Module - A Brief Tutorial, to help you get started with the Webform module for your Drupal 9 or 10 website; it will help you understand the basics easily. The Webform module ships with tons of interesting features! Allow me to present a concise overview of some of the noteworthy features and functionalities that make this module a powerhouse in the world of Drupal. This is an updated version of the article last published on 12 October 2021, designed to keep you in the loop with Drupal 10 and its latest developments.

Altering form & elements

Any form, element and its related settings can be altered by using their respective hooks. Below are a few hooks that are available to use; you can find more in the webform.api.php file:

  • Form hooks
      ◦ hook_webform_submission_form_alter(): Perform alterations before a webform submission form is rendered.
  • Element hooks
      ◦ hook_webform_element_alter(): Alter webform elements.
  • Option hooks
      ◦ hook_webform_options_alter(): Alter webform options.
  • Handler hooks
      ◦ hook_webform_handler_invoke_alter(): Act on a webform handler when a method is invoked.
  • More hooks…
      ◦ hook_webform_access_rules_alter(), etc.: Alter the list of access rules that should be managed on a per-webform level.

YAML Source

The Webform module started as a YAML Form module, which allowed people to build forms by writing YAML markup. At some point, the YAML Form module started to have a UI and became the Webform module for Drupal 8. YAML provides a simple and easy-to-learn markup language for building and bulk editing a webform’s elements. The (View) Source page allows developers to edit a webform’s render array using YAML markup. Developers can use the (View) Source page to hand-code webforms, to quickly alter a webform’s labels, cut-n-paste multiple elements, reorder elements, as well as add custom properties and markup to elements. Here’s an example of a Contact form and its corresponding YAML source code:

(Image: A Contact form with drag-n-drop UI)

(Image: The Contact form’s YAML source code)

Conditional fields

Webform lets you add conditional logic to the elements within your form. Let us consider a small example, wherein we need to conditionally handle the visibility of elements based on the value of another element within the form. Here’s an example form with two step fields: STEP 1 (Radios element) with options ‘Email’ and ‘Mobile Number’, and STEP 2 (Fieldset) with two elements, ‘Email’ and ‘Mobile Number’.

(Image: Form Build page)

(Image: Form View page)

In the above example, I would like to show the ‘Email’ field if the ‘Email’ option is chosen in Step 1, else show the ‘Mobile Number’ field if the ‘Mobile Number’ option is chosen in Step 1. To achieve that, edit your ‘Email’ field, click on the ‘Conditions’ tab, choose the ‘State’ as ‘Visible’ and set the ‘Trigger/Value’ as ‘STEP 1 [Radios] value is email’. Similarly, follow the same steps to add conditional logic to your ‘Mobile Number’ field and set the ‘Trigger/Value’ as ‘STEP 1 [Radios] value is mobile_number’.

Here’s the final look of the Webform:

(Image: Setting up conditional logic)

(Image: Form when ‘Email’ is chosen on STEP 1)

(Image: Form when ‘Mobile Number’ is chosen on STEP 1)

Custom options properties

Webform lets you add custom option properties to your form elements. Imagine a scenario where you want to conditionally handle the options of a radio element based on the value of a different element within the form. How would you do that? I did not find any way to handle it through the Conditional logic settings in the UI, but there is a provision to set ‘custom options properties’ on your element, wherein you write the required conditional logic targeting the options within the element using YAML code. Here is an example with two radio elements, where the visibility of the options within the second element should change based on the option selected in the first element.

(Image: Form Build page)

(Image: Form View page before adding any custom options properties)

If ‘Type A’ is chosen, then ‘Option 1’ and ‘Option 2’ should be visible in the second element. Similarly, if ‘Type B’ is chosen, then ‘Option 3’ and ‘Option 4’ should be visible. To achieve this, edit the second element, go to the ‘Advanced’ tab, scroll down to the ‘Options (custom) properties’ section and write the necessary logic in YAML.

(Image: Setting up the options properties)

(Image: Form when ‘Type A’ is chosen)

(Image: Form when ‘Type B’ is chosen)

Webform submission email handlers

Email handlers

An email handler sends a webform submission via email. To add email handlers to your webform, go to ‘Settings’ and then the ‘Emails/Handlers’ tab. Next, click on the ‘Add Email / Add handler’ button.

(Image: Add Email Handler)

As shown in the image below, on the ‘General’ tab, add the ‘Title’ and set the ‘Send to’ and ‘Send from’ details. Add the message ‘Subject’ and ‘Body’ as required and save the configuration form. And that’s about it: your handler gets fired whenever the form is submitted.

You can also set conditional email handlers on your webform, i.e., trigger different email handlers based on the value of certain elements within the form. For example, let us consider a ‘Select’ element with values ‘Type 1’ and ‘Type 2’. If the user submits ‘Type 1’, trigger the ‘Email - Type 1’ handler that has its ‘To’ address set to ‘user1@gmail.com’. If the user submits ‘Type 2’, trigger the ‘Email - Type 2’ handler that has its ‘To’ address set to ‘user2@gmail.com’. To add conditional logic to your email handler, create one handler and name it ‘Email - Type 1’. Set the ‘To’ address to ‘user1@mail.com’, switch to the ‘Conditions’ tab, choose the ‘State’ as ‘Visible’ and set the ‘Trigger/Value’ as ‘Select Type [Select] value is type_1’. Similarly, create the second handler and name it ‘Email - Type 2’. Set the ‘To’ address to ‘user2@gmail.com’, switch to the ‘Conditions’ tab, choose the ‘State’ as ‘Visible’ and set the ‘Trigger/Value’ as ‘Select Type [Select] value is type_2’.

Scheduled email handlers

The scheduled email handler extends the Webform module’s email handler to allow emails to be scheduled. To use this feature, enable the ‘Webform Scheduled Email Handler’ sub-module. To schedule sending an email of the form submissions, click on the ‘Add handler’ button and select the ‘Scheduled Email’ handler. There is only one extra config setting in the ‘Scheduled Email’ handler compared to the normal ‘Email’ handler, and that is the Schedule email date under the General settings tab.

(Image: Scheduled email handler)

Set the date to trigger your handler and, when the next cron run happens, your email will be sent!

Generate PDF copies of Webform submissions

The Webform module lets us download and export webform submissions as PDF documents. It also supports sending these PDF documents as email attachments and customising the templates and CSS of the PDF documents. How do we get this working? To begin with, you need to enable the Webform Entity Print (PDF) submodule (to download and export the PDF documents) and the Webform Entity Print (PDF) Attachment submodule (to send these PDF documents as email attachments). Please note: it requires the Entity Print module to be installed, which allows you to print any Drupal entity (Drupal 7, 9 and 10) or View (Drupal 9 and 10 only) to PDF.

Download the PDF of a single submission

Visit your webform’s results page, view the submission you want to download and scroll down to click on ‘Download PDF’.

(Image: Click on ‘Download PDF’)

(Image: Downloaded PDF)

Export multiple webform submissions as PDF documents in a zip file

Visit your webform’s results page, go to the Download exports tab, choose the export format as ‘PDF documents’ and hit Download. You will get all your webform submissions as PDFs in a single *.tar.gz or *.zip file.

(Image: Choose Export format as ‘PDF documents’)

Customise the templates and CSS of the PDF documents

Go to your webform’s general settings tab (/admin/structure/webform/manage/{your-webform-name}/settings), and scroll down to the ‘Entity print’ settings section. Add the necessary header and footer template details (it supports tokens) and some CSS to be shown in your PDF.

(Image: Customising your PDF by adding a default header, footer and some CSS)

(Image: PDF after customizing)

Attach these PDF documents to your emails

Start by adding an element called PDF to your webform. Add the PDF element details like its title, view mode, file name etc. and save it.

(Image: Click on ‘Add element’)

(Image: Add PDF element details)

The PDF element is now added to your results column (please note: we are not adding it to our form). When you click on it, you will be redirected to your PDF document. The next step is to attach it to your emails. For that, go into Settings > Email/Handlers, edit any of your email handlers, check the PDF element, enable ‘Include files as attachments’ and save the handler.

(Image: Enable ‘Include files as attachments’)

The next time you submit the webform, the PDF will be attached to your email along with the webform submission details.

(Image: Contact.pdf is attached in the email attachments)

Finding Help

There are different ways through which you can seek help with the Webform module. Here is a list of a few sources:

  • Documentation, Cookbook, & Screencasts - https://www.drupal.org/docs/contributed-modules/webform
  • Webform Issue Queue - https://www.drupal.org/project/issues/webform
  • Drupal Answers - http://drupal.stackexchange.com
  • Slack channel - You can always post your queries regarding the Webform module in the #webform channel within the Drupal Slack workspace. Anyone from the community, even the module maintainer themselves, is always around and kind enough to guide you with your problems.

A HUGE shoutout to Jacob Rockowitz for his relentless support of the Drupal 9/10 Webform module. Webform wouldn't be what it is now without him.

Final Thoughts

There’s just so much this versatile Webform module can do that we cannot possibly cover it all in one article. If you love Webforms as much as we do, we’d encourage you to get involved and contribute in any way possible. Looking for professional Drupal services and expertise to build your next Drupal website with beneficial modules such as this? 
We would love to hear from you. 
We would love to hear from you. 
Categories: FLOSS Project Planets

Golems GABB: Enhancing Site Performance with Drupal's BigPipe Module

Planet Drupal - Tue, 2023-12-26 06:04

Nowadays, the stakes keep getting higher. Speed is of the essence for the ordinary user, and every web page and developer has to keep up. This is where Drupal's BigPipe module takes the stage.
BigPipe is not a simple addition but a powerful tool for ensuring faster page loading times. It achieves this by fragmenting a web page into smaller units and prioritizing their loading order. Following this logic, the most crucial parts are loaded first, while the others catch up later. Let's delve deeper into this marvelous technology and learn more about web page optimization using Drupal's BigPipe module.

Categories: FLOSS Project Planets

LN Webworks: Top 7 Secrets No One Told You For The Success of Drupal Web Development Projects

Planet Drupal - Tue, 2023-12-26 04:52

Drupal has solidified its position as a go-to content management system and web development framework, earning the trust of businesses and developers worldwide. As more organizations lean on Drupal for their web development needs, the demand for top-tier projects, handled by a seasoned Drupal development company, is on the rise.

In this article, we're delving into the seven secrets that set the best Drupal web development projects apart. These are core industry insights from top Drupal website development companies that improve your chances of project success.

Categories: FLOSS Project Planets

The Drop Times: The Essential API Client for Seamless JavaScript Integration

Planet Drupal - Tue, 2023-12-26 02:54
Explore the unfolding saga of Drupal's leap into JavaScript with the official API Client. Delve into insights from contributors Coby Sher and Pratik Kamble, offering a glimpse into the initiative's transformative journey. Discover how this development is set to reshape the Drupal and JavaScript integration.
Categories: FLOSS Project Planets

TechBeamers Python: Multiple Ways to Loop Through a Dictionary in Python

Planet Python - Tue, 2023-12-26 00:09

Dictionaries are a fundamental data structure in Python, offering a powerful way to store and organize data. When it comes to working with dictionaries, one common task is iterating through their key-value pairs. In this tutorial, we will delve into various methods to loop through a dictionary in Python, exploring different aspects and providing multiple […]
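A few of the common iteration patterns look like this (illustrative, not taken from the tutorial):

inventory = {"apples": 3, "pears": 5, "plums": 0}

# Iterating a dict directly yields its keys (in insertion order).
for fruit in inventory:
    print(fruit)

# items() yields (key, value) pairs.
for fruit, count in inventory.items():
    print(f"{fruit}: {count}")

# values() when only the values matter, e.g. to total the stock.
print("total:", sum(inventory.values()))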

The post Multiple Ways to Loop Through a Dictionary in Python appeared first on TechBeamers.

Categories: FLOSS Project Planets

Russ Allbery: Review: A Study in Honor

Planet Debian - Mon, 2023-12-25 22:37

Review: A Study in Honor, by Claire O'Dell

Series: Janet Watson Chronicles #1
Publisher: Harper Voyager
Copyright: July 2018
ISBN: 0-06-269932-6
Format: Kindle
Pages: 295

A Study in Honor is a near-future science fiction novel by Claire O'Dell, a pen name for Beth Bernobich. You will see some assertions, including by the Lambda Literary Award judges, that it is a mystery novel. There is a mystery, but... well, more on that in a moment.

Janet Watson was an Army surgeon in the Second US Civil War when New Confederacy troops overran the lines in Alton, Illinois. Watson lost her left arm to enemy fire. As this book opens, she is returning to Washington, D.C. with a medical discharge, PTSD, and a field replacement artificial arm scavenged from another dead soldier. It works, sort of, mostly, despite being mismatched to her arm and old in both technology and previous wear. It does not work well enough for her to resume her career as a surgeon.

Watson's plan is to request a better artificial arm from the VA (the United States Department of Veterans Affairs, which among other things is responsible for the medical care of wounded veterans). That plan meets a wall of unyielding and uninterested bureaucracy. She has a pension, but it's barely enough for cheap lodging. A lifeline comes in the form of a chance encounter with a former assistant in the Army, who has a difficult friend looking to split the cost of an apartment. The name of that friend is Sara Holmes.

At this point, you know what to expect. This is clearly one of the many respinnings of Arthur Conan Doyle. This time, the setting is in the future and Watson and Holmes are both black women, but the other elements of the setup are familiar: the immediate deduction that Watson came from the front, the shared rooms (2809 Q Street this time, sacrificing homage for the accuracy of a real address), Holmes's tendency to play an instrument (this time the piano), and even the title of this book, which is an obvious echo of the title of the first Holmes novel, A Study in Scarlet.

Except that's not what you'll get. There are a lot of parallels and references here, but this is not a Holmes-style detective novel.

First, it's only arguably a detective novel at all. There is a mystery, which starts with a patient Watson sees in her fallback job as a medical tech in the VA hospital and escalates to a physical attack, but that doesn't start until a third of the way into the book. It certainly is not solved through minute clues and leaps of deduction; instead, that part of the plot has the shape of a thriller rather than a classic mystery. There is a good argument that the thriller is the modern mystery novel, so I don't want to overstate my case, but I think someone who came to this book wanting a Doyle-style mystery would be disappointed.

Second, the mystery is not the heart of this book. Watson is. She, like Doyle's Watson, is the first-person narrator, but she is far more present in the book. I have no idea how accurate O'Dell's portrayal of Watson's PTSD is, but it was certainly compelling and engrossing reading. Her fight for basic dignity and her rage at the surface respect and underlying disinterested hostility of the bureaucratic war machinery is what kept me turning the pages. The mystery plot is an outgrowth of that and felt more like a case study than the motivating thread of the plot.

And third, Sara Holmes... well, I hesitate to say definitively that she's not Sherlock Holmes. There have been so many versions of Holmes over the years, even apart from the degree to which a black woman would necessarily not be like Doyle's character. But she did not remind me of Sherlock Holmes. She reminded me of a cross between James Bond and a high fae.

This sounds like a criticism. It very much is not. I found this high elf spy character far more interesting than I have ever found Sherlock Holmes. But here again, if you came into this book hoping for a Holmes-style master detective, I fear you may be wrong-footed.

The James Bond parts will be obvious when you get there and aren't the most interesting (and thankfully the misogyny is entirely absent). The part I found more fascinating is the way O'Dell sets Holmes apart by making her fae rather than insufferable. She projects effortless elegance, appears and disappears on a mysterious schedule of her own, thinks nothing of reading her roommate's diary, leaves meticulously arranged gifts, and even bargains with Watson for answers to precisely three questions. The reader does learn some mundane explanations for some of this behavior, but to be honest I found them somewhat of a letdown. Sara Holmes is at her best as a character when she tacks her own mysterious path through a rather grim world of exhausted war, penny-pinching bureaucracy, and despair, pursuing an unexplained agenda of her own while showing odd but unmistakable signs of friendship and care.

This is not a romance, at least in this book. It is instead a slowly-developing friendship between two extremely different people, one that I thoroughly enjoyed.

I do have a couple of caveats about this book. The first is that the future US in which it is set is almost pure Twitter doomcasting. Trump's election sparked a long slide into fascism, and when that was arrested by the election of a progressive candidate backed by a fragile coalition, Midwestern red states seceded to form the New Confederacy and start a second civil war that has dragged on for nearly eight years. It's a very specific mainstream liberal dystopian scenario that I've seen so many times it felt like a cliche even though I don't remember seeing it in a book before. This type of future projection of current fears is of course not new for science fiction; Cold War nuclear war novels are probably innumerable. But I had questions, such as how a sparsely-populated, largely non-industrial, and entirely landlocked set of breakaway states could maintain a war footing for eight years. Despite some hand-waving about covert support, those questions are not really answered here.

The second problem is that the ending of this book kind of falls apart. The climax of the mystery investigation is unsatisfyingly straightforward, and the resulting revelation is a hoary cliche. Maybe I'm just complaining about the banality of evil, but if I'd been engrossed in this book for the thriller plot, I think I would have been annoyed.

I wasn't, though; I was here for the characters, for Watson's PTSD and dogged determination, for Sara's strangeness, and particularly for the growing improbable friendship between two women with extremely different life experiences, emotions, and outlooks. That part was great, regardless of the ending.

Do not pick this book up because you want a satisfying deductive mystery with bumbling police and a blizzard of apparently inconsequential clues. That is not at all what's happening here. But this was great on its own terms, and I will be reading the sequel shortly. Recommended, although if you are very online expect to do a bit of eye-rolling at the setting.

Followed by The Hound of Justice.

Rating: 8 out of 10

Categories: FLOSS Project Planets

Russ Allbery: krb5-strength 3.3

Planet Debian - Mon, 2023-12-25 22:24

krb5-strength is a toolkit of plugins and support programs for password strength checking for Kerberos KDCs, either Heimdal or MIT. It also includes a password history mechanism for Heimdal KDCs.

This is a maintenance release, since there hadn't been a new release since 2020. It contains the normal sort of build system and portability updates, and the routine bump in the number of hash iterations used for the history mechanism to protect against (some) brute force attacks. It also includes an RPM spec file contributed by Daria Phoebe Brashear, and some changes to the Perl dependencies to track current community recommendations.

You can get the latest version from the krb5-strength distribution page.

Categories: FLOSS Project Planets

Sergio Talens-Oliag: GitLab CI/CD Tips: Automatic Versioning Using semantic-release

Planet Debian - Mon, 2023-12-25 18:30

This post describes how I’m using semantic-release on gitlab-ci to manage versioning automatically for different kinds of projects following a simple workflow (a develop branch where changes are added or merged to test new versions, a temporary release/#.#.# to generate the release candidate versions and a main branch where the final versions are published).

What is semantic-release

It is a Node.js application designed to manage project versioning information on Git repositories using a continuous integration system (in this post we will use gitlab-ci).

How does it work

By default semantic-release uses semver for versioning (release versions use the format MAJOR.MINOR.PATCH) and commit messages are parsed to determine the next version number to publish.

If after analyzing the commits the version number has to be changed, the command updates the files we tell it to (i.e. the package.json file for nodejs projects and possibly a CHANGELOG.md file), creates a new commit with the changed files, creates a tag with the new version and pushes the changes to the repository.

When running on a CI/CD system we usually generate the artifacts related to a release (a package, a container image, etc.) from the tag, as it includes the right version number and usually has passed all the required tests (it is a good idea to run the tests again in any case, as someone could create a tag manually or we could run extra jobs when building the final assets …​ if they fail it is not a big issue anyway, numbers are cheap and infinite, so we can skip releases if needed).

Commit messages and versioning

The commit messages must follow a known format, the default module used to analyze them uses the angular git commit guidelines, but I prefer the conventional commits one, mainly because it’s a lot easier to use when you want to update the MAJOR version.

The commit message format used must be:

<type>(optional scope): <description>

[optional body]

[optional footer(s)]

The system supports three types of branches: release, maintenance and pre-release, but for now I’m not using maintenance ones.

The branches I use and their types are:

  • main as release branch (final versions are published from there)
  • develop as pre release branch (used to publish development and testing versions with the format #.#.#-SNAPSHOT.#)
  • release/#.#.# as pre release branches (they are created from develop to publish release candidate versions with the format #.#.#-rc.# and once they are merged with main they are deleted)

On the release branch (main) the version number is updated as follows:

  1. The MAJOR number is incremented if a commit with a BREAKING CHANGE: footer or an exclamation (!) after the type/scope is found in the list of commits found since the last version change (it looks for tags on the same branch).
  2. The MINOR number is incremented if the MAJOR number is not going to be changed and there is a commit with type feat in the commits found since the last version change.
  3. The PATCH number is incremented if neither the MAJOR nor the MINOR numbers are going to be changed and there is a commit with type fix in the commits found since the last version change.

On the pre release branches (develop and release/#.#.#) the version and pre release numbers are always calculated from the last published version available on the branch (i.e. if we published version 1.3.2 on main we need to have the commit with that tag on the develop or release/#.#.# branch to correctly work out what the next version will be).

The version number is updated as follows:

  1. The MAJOR number is incremented if a commit with a BREAKING CHANGE: footer or an exclamation (!) after the type/scope is found in the list of commits found since the last released version.

    In our example it was 1.3.2 and the version is updated to 2.0.0-SNAPSHOT.1 or 2.0.0-rc.1 depending on the branch.

  2. The MINOR number is incremented if the MAJOR number is not going to be changed and there is a commit with type feat in the commits found since the last released version.

    In our example the release was 1.3.2 and the version is updated to 1.4.0-SNAPSHOT.1 or 1.4.0-rc.1 depending on the branch.

  3. The PATCH number is incremented if neither the MAJOR nor the MINOR numbers are going to be changed and there is a commit with type fix in the commits found since the last version change.

    In our example the release was 1.3.2 and the version is updated to 1.3.3-SNAPSHOT.1 or 1.3.3-rc.1 depending on the branch.

  4. The pre release number is incremented if the MAJOR, MINOR and PATCH numbers are not going to be changed but there is a commit that would otherwise update the version (i.e. a fix on 1.3.3-SNAPSHOT.1 will set the version to 1.3.3-SNAPSHOT.2, a fix or feat on 1.4.0-rc.1 will set the version to 1.4.0-rc.2 and so on).
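Putting those rules together, a minimal sketch of the decision order (this is only an illustration of the logic described above, not semantic-release's code, and it omits the pre release counter handling of point 4):

def next_version(last, commits, prerelease=None):
    # last is a (major, minor, patch) tuple; commits is a list of
    # (type, breaking) tuples parsed from conventional commit messages.
    major, minor, patch = last
    if any(breaking for _, breaking in commits):
        major, minor, patch = major + 1, 0, 0
    elif any(ctype == "feat" for ctype, _ in commits):
        minor, patch = minor + 1, 0
    elif any(ctype == "fix" for ctype, _ in commits):
        patch += 1
    else:
        return None  # nothing to release
    version = f"{major}.{minor}.{patch}"
    return f"{version}-{prerelease}.1" if prerelease else version

print(next_version((1, 3, 2), [("feat", False)], prerelease="rc"))  # 1.4.0-rc.1
print(next_version((1, 3, 2), [("fix", True)]))                     # 2.0.0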
How do we manage its configuration

Although the system is designed to work with nodejs projects, it can be used with multiple programming languages and project types.

For nodejs projects the usual place to put the configuration is the project’s package.json, but I prefer to use the .releaserc file instead.

As I use a common set of CI templates, instead of using a .releaserc on each project I generate it on the fly on the jobs that need it, replacing values related to the project type and the current branch on a template using the tmpl command (lately I use a branch of my own fork while I wait for some feedback from upstream, as you will see on the Dockerfile).

Container used to run it

As we run the command on a gitlab-ci job we use the image built from the following Dockerfile:

Dockerfile

# Semantic release image
FROM golang:alpine AS tmpl-builder
#RUN go install github.com/krakozaure/tmpl@v0.4.0
RUN go install github.com/sto/tmpl@v0.4.0-sto.2

FROM node:lts-alpine
COPY --from=tmpl-builder /go/bin/tmpl /usr/local/bin/tmpl
RUN apk update &&\
  apk upgrade &&\
  apk add curl git jq openssh-keygen yq zip &&\
  npm install --location=global\
    conventional-changelog-conventionalcommits@6.1.0\
    @qiwi/multi-semantic-release@7.0.0\
    semantic-release@21.0.7\
    @semantic-release/changelog@6.0.3\
    semantic-release-export-data@1.0.1\
    @semantic-release/git@10.0.1\
    @semantic-release/gitlab@9.5.1\
    @semantic-release/release-notes-generator@11.0.4\
    semantic-release-replace-plugin@1.2.7\
    semver@7.5.4\
  &&\
  rm -rf /var/cache/apk/*
CMD ["/bin/sh"]

Note:

The versions of some of the components are not the latest ones, I try to review them from time to time but you know the saying: if it ain’t broken, don’t fix it.

Note:

The image includes some tools and modules I use on other projects, like qiwi/multi-semantic-release, a fork of semantic-release that allows publishing multiple packages from a single repository (monorepo).

I’ll probably write about how I use it in a future post.

How and when is it executed

The job that runs semantic-release is executed when new commits are added to the develop, release/#.#.# or main branches (basically when something is merged or pushed) and after all tests have passed (we don’t want to create a new version that does not compile or does not at least pass the unit tests).

The job is something like the following:

semantic_release:
  image: $SEMANTIC_RELEASE_IMAGE
  rules:
    - if: '$CI_COMMIT_BRANCH =~ /^(develop|main|release\/\d+.\d+.\d+)$/'
      when: always
  stage: release
  before_script:
    - echo "Loading scripts.sh"
    - . $ASSETS_DIR/scripts.sh
  script:
    - sr_gen_releaserc_json
    - git_push_setup
    - semantic-release

Where the SEMANTIC_RELEASE_IMAGE variable contains the URI of the image built using the Dockerfile above and the sr_gen_releaserc_json and git_push_setup are functions defined on the $ASSETS_DIR/scripts.sh file:

  • The sr_gen_releaserc_json function generates the .releaserc.json file using the tmpl command.
  • The git_push_setup function configures git to allow pushing changes to the repository with the semantic-release command, optionally signing them with a SSH key.
The sr_gen_releaserc_json function

The code for the sr_gen_releaserc_json function is the following:

sr_gen_releaserc_json() {
  # Use nodejs as default project_type
  project_type="${PROJECT_TYPE:-nodejs}"
  # REGEX to match the rc_branch name
  rc_branch_regex='^release\/[0-9]\+\.[0-9]\+\.[0-9]\+$'
  # PATHS on the local ASSETS_DIR
  assets_dir="${CI_PROJECT_DIR}/${ASSETS_DIR}"
  sr_local_plugin="${assets_dir}/local-plugin.cjs"
  releaserc_tmpl="${assets_dir}/releaserc.json.tmpl"
  pipeline_runtime_values_yaml="/tmp/releaserc_values.yaml"
  pipeline_values_yaml="${assets_dir}/values_${project_type}_project.yaml"
  # Destination PATH
  releaserc_json=".releaserc.json"
  # Create an empty pipeline_values_yaml if missing
  test -f "$pipeline_values_yaml" || : >"$pipeline_values_yaml"
  # Create the pipeline_runtime_values_yaml file
  echo "branch: ${CI_COMMIT_BRANCH}" >"$pipeline_runtime_values_yaml"
  echo "sr_local_plugin: ${sr_local_plugin}" >>"$pipeline_runtime_values_yaml"
  # Add the rc_branch name if we are on an rc_branch
  if [ "$(echo "$CI_COMMIT_BRANCH" | sed -ne "/$rc_branch_regex/{p}")" ]; then
    echo "rc_branch: ${CI_COMMIT_BRANCH}" >>"$pipeline_runtime_values_yaml"
  elif [ "$(echo "$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME" |
      sed -ne "/$rc_branch_regex/{p}")" ]; then
    echo "rc_branch: ${CI_MERGE_REQUEST_SOURCE_BRANCH_NAME}" \
      >>"$pipeline_runtime_values_yaml"
  fi
  # Create the releaserc_json file
  tmpl -f "$pipeline_runtime_values_yaml" -f "$pipeline_values_yaml" \
    "$releaserc_tmpl" | jq . >"$releaserc_json"
  # Remove the pipeline_runtime_values_yaml file
  rm -f "$pipeline_runtime_values_yaml"
  # Print the releaserc_json file
  print_file_collapsed "$releaserc_json"
  # --*-- BEG: NOTE --*--
  # Rename the package.json to ignore it when calling semantic release.
  # The idea is that the local-plugin renames it back on the first step of the
  # semantic-release process.
  # --*-- END: NOTE --*--
  if [ -f "package.json" ]; then
    echo "Renaming 'package.json' to 'package.json_disabled'"
    mv "package.json" "package.json_disabled"
  fi
}

Almost all the variables used on the function are defined by gitlab except the ASSETS_DIR and PROJECT_TYPE; in the complete pipelines the ASSETS_DIR is defined on a common file included by all the pipelines and the project type is defined on the .gitlab-ci.yml file of each project.

If you review the code you will see that the file processed by the tmpl command is named releaserc.json.tmpl, its contents are shown here:

{ "plugins": [ {{- if .sr_local_plugin }} "{{ .sr_local_plugin }}", {{- end }} [ "@semantic-release/commit-analyzer", { "preset": "conventionalcommits", "releaseRules": [ { "breaking": true, "release": "major" }, { "revert": true, "release": "patch" }, { "type": "feat", "release": "minor" }, { "type": "fix", "release": "patch" }, { "type": "perf", "release": "patch" } ] } ], {{- if .replacements }} [ "semantic-release-replace-plugin", { "replacements": {{ .replacements | toJson }} } ], {{- end }} "@semantic-release/release-notes-generator", {{- if eq .branch "main" }} [ "@semantic-release/changelog", { "changelogFile": "CHANGELOG.md", "changelogTitle": "# Changelog" } ], {{- end }} [ "@semantic-release/git", { "assets": {{ if .assets }}{{ .assets | toJson }}{{ else }}[]{{ end }}, "message": "ci(release): v${nextRelease.version}\n\n${nextRelease.notes}" } ], [ "@semantic-release/gitlab", { "gitlabUrl": "https://gitlab.com/", "successComment": false } ] ], "branches": [ { "name": "develop", "prerelease": "SNAPSHOT" }, {{- if .rc_branch }} { "name": "{{ .rc_branch }}", "prerelease": "rc" }, {{- end }} "main" ] }

The values used to process the template are defined on a file built on the fly (releaserc_values.yaml) that includes the following keys and values:

  • branch: the name of the current branch
  • sr_local_plugin: the path to the local plugin we use (shown later)
  • rc_branch: the name of the current rc branch; we only set the value if we are processing one because semantic-release only allows one branch to match the rc prefix; if we used a wildcard (i.e. release/*) and users kept more than one release/#.#.# branch open at the same time, the calls to semantic-release would fail for sure.

The template also uses a values_${project_type}_project.yaml file that includes settings specific to the project type, the one for nodejs is as follows:

replacements:
  - files:
      - "package.json"
    from: "\"version\": \".*\""
    to: "\"version\": \"${nextRelease.version}\""
assets:
  - "CHANGELOG.md"
  - "package.json"

The replacements section is used to update the version field on the relevant files of the project (in our case the package.json file) and the assets section includes the files that will be committed to the repository when the release is published (looking at the template you can see that the CHANGELOG.md is only updated for the main branch; we do it this way because updating the file on other branches creates a merge nightmare, and we are only interested in it for released versions anyway).

The local plugin adds code to rename the package.json_disabled file to package.json if present and prints the last and next versions on the logs with a format that can be easily parsed using sed:

local-plugin.cjs

// Minimal plugin to:
// - rename the package.json_disabled file to package.json if present
// - log the semantic-release last & next versions
function verifyConditions(pluginConfig, context) {
  var fs = require('fs');
  if (fs.existsSync('package.json_disabled')) {
    fs.renameSync('package.json_disabled', 'package.json');
    context.logger.log(`verifyConditions: renamed 'package.json_disabled' to 'package.json'`);
  }
}

function analyzeCommits(pluginConfig, context) {
  if (context.lastRelease && context.lastRelease.version) {
    context.logger.log(`analyzeCommits: LAST_VERSION=${context.lastRelease.version}`);
  }
}

function verifyRelease(pluginConfig, context) {
  if (context.nextRelease && context.nextRelease.version) {
    context.logger.log(`verifyRelease: NEXT_VERSION=${context.nextRelease.version}`);
  }
}

module.exports = {
  verifyConditions,
  analyzeCommits,
  verifyRelease
}

The git_push_setup function

The code for the git_push_setup function is the following:

git_push_setup() {
  # Update global credentials to allow git clone & push for all the group repos
  git config --global credential.helper store
  cat >"$HOME/.git-credentials" <<EOF
https://fake-user:${GITLAB_REPOSITORY_TOKEN}@gitlab.com
EOF
  # Define user name, mail and signing key for semantic-release
  user_name="$SR_USER_NAME"
  user_email="$SR_USER_EMAIL"
  ssh_signing_key="$SSH_SIGNING_KEY"
  # Export git user variables
  export GIT_AUTHOR_NAME="$user_name"
  export GIT_AUTHOR_EMAIL="$user_email"
  export GIT_COMMITTER_NAME="$user_name"
  export GIT_COMMITTER_EMAIL="$user_email"
  # Sign commits with ssh if there is a SSH_SIGNING_KEY variable
  if [ "$ssh_signing_key" ]; then
    echo "Configuring GIT to sign commits with SSH"
    ssh_keyfile="/tmp/.ssh-id"
    : >"$ssh_keyfile"
    chmod 0400 "$ssh_keyfile"
    echo "$ssh_signing_key" | tr -d '\r' >"$ssh_keyfile"
    git config gpg.format ssh
    git config user.signingkey "$ssh_keyfile"
    git config commit.gpgsign true
  fi
}

The function assumes that the GITLAB_REPOSITORY_TOKEN variable (set on the CI/CD variables section of the project or group we want) contains a token with read_repository and write_repository permissions on all the projects we are going to use this function.

The SR_USER_NAME and SR_USER_EMAIL variables can be defined on a common file or the CI/CD variables section of the project or group we want to work with and the script assumes that the optional SSH_SIGNING_KEY is exported as a CI/CD default value of type variable (that is why the keyfile is created on the fly) and git is configured to use it if the variable is not empty.

Warning:

Keep in mind that the variables GITLAB_REPOSITORY_TOKEN and SSH_SIGNING_KEY contain secrets, so it is probably a good idea to make them protected (if you do that you have to make the develop, main and release/* branches protected too).

Warning:

The semantic-release user has to be able to push to all the projects on those protected branches; it is a good idea to create a dedicated user and add it as a MAINTAINER for the projects we want (the MAINTAINERS need to be able to push to the branches), or, if you are using GitLab with a Premium license, you can use the API to allow the semantic-release user to push to the protected branches without allowing it for any other user.

The semantic-release command

Once we have the .releaserc file and the git configuration ready we run the semantic-release command.

If the branch we are working with has one or more commits that will increment the version, the tool does the following (note that the steps described are the ones executed if we use the configuration we have generated):

  1. It detects the commits that will increment the version and calculates the next version number.
  2. Generates the release notes for the version.
  3. Applies the replacements defined on the configuration (in our example updates the version field on the package.json file).
  4. Updates the CHANGELOG.md file adding the release notes if we are going to publish the file (when we are on the main branch).
  5. Creates a commit if all or some of the files listed on the assets key have changed and uses the commit message we have defined, replacing the variables with their current values.
  6. Creates a tag with the new version number and the release notes.
  7. As we are using the gitlab plugin after tagging it also creates a release on the project with the tag name and the release notes.
Notes about the git workflows and merges between branches

It is very important to remember that semantic-release looks at the commits of a given branch when calculating the next version to publish, which has two important implications:

  1. On pre release branches we need to have the commit that includes the tag with the released version; if we don’t have it, the next version is not calculated correctly.
  2. It is a bad idea to squash commits when merging a branch into another one; if we do that we will lose the information semantic-release needs to calculate the next version, and even if we use the right prefix for the squashed commit (fix, feat, …) we miss all the messages that would otherwise go to the CHANGELOG.md file.

To make sure that we have the right commits on the pre release branches we should merge the main branch changes into the develop one after each release tag is created; in my pipelines the first job that processes a release tag creates a branch from the tag and an MR to merge it to develop.

The important thing about that MR is that it must not be squashed; if we do that the tag commit will probably be lost, so we need to be careful.

To merge the changes directly we can run the following code:

# Set the SR_TAG variable to the tag you want to process
SR_TAG="v1.3.2"
# Fetch all the changes
git fetch --all --prune
# Switch to the main branch
git switch main
# Pull all the changes
git pull
# Switch to the development branch
git switch develop
# Pull all the changes
git pull
# Create followup branch from tag
git switch -c "followup/$SR_TAG" "$SR_TAG"
# Change files manually & commit the changed files
git commit -a --untracked-files=no -m "ci(followup): $SR_TAG to develop"
# Switch to the development branch
git switch develop
# Merge the followup branch into the development one using the --no-ff option
git merge --no-ff "followup/$SR_TAG"
# Remove the followup branch
git branch -d "followup/$SR_TAG"
# Push the changes
git push

If we can’t push directly to develop we can create a MR pushing the followup branch after committing the changes, but we have to make sure that we don’t squash the commits when merging or it will not work as we want.

Note:

We haven’t discussed the release/#.#.# branches because our assumption is that they don’t exist after a release is published (the branch is deleted after merging it to main) and new release candidate branches will be created from develop and the commits included with the followup will already be present.

Categories: FLOSS Project Planets

John Goerzen: The Grumpy Cricket (And Other Enormous Creatures)

Planet Debian - Mon, 2023-12-25 15:23

This Christmas, one of my gifts to my kids was a text adventure (interactive fiction) game for them. Now that they’ve enjoyed it, I’m releasing it under the GPL v3.

As interactive fiction, it’s like an e-book, but the reader is also the player, guiding the exploration of the world.

The Grumpy Cricket is designed to be friendly for a first-time player of interactive fiction. There is no way to lose the game or to die. There is an in-game hint system providing context-sensitive hints anytime the reader types HINT. There are splashes of humor throughout that got all three of my kids laughing.

I wrote it in 2023 for my kids, who range in age from 6 to 17. That’s quite a wide range, but they all enjoyed it.

You can download it, get the source, or play it online in a web browser at https://www.complete.org/the-grumpy-cricket/

Categories: FLOSS Project Planets
