Feeds

Real Python: Python News Roundup: July 2024

Planet Python - Mon, 2024-07-08 10:00

Summer isn’t all holidays and lazy days at the beach. Over the last month, two important players in the data science ecosystem released new major versions. NumPy published version 2.0, which comes with several improvements but also some breaking changes. At the same time, Polars reached its version 1.0 milestone and is now considered production-ready.

PyCon US was hosted in Pittsburgh, Pennsylvania in May. The conference is an important meeting spot for the community and sparked some new ideas and discussions. You can read about some of these in PSF’s coverage of the Python Language Summit, and watch some of the videos posted from the conference.

Dive in to learn more about the most important Python news from the last month.

NumPy Version 2.0

NumPy is a foundational package in the data science space. The library provides in-memory N-dimensional arrays and many functions for fast operations on those arrays.

Many libraries in the ecosystem use NumPy under the hood, including pandas, SciPy, and scikit-learn. The NumPy package has been around for close to twenty years and has played an important role in the rising popularity of Python among data scientists.

The new version 2.0 of NumPy is an important milestone, which adds an improved string type, cleans up the library, and improves performance. However, it comes with some changes that may affect your code.

The biggest breaking changes happen in the C-API of NumPy. Typically, this won’t affect you directly, but it can affect other libraries that you rely on. The community has rallied strongly and most of the bigger packages already support NumPy 2.0. You can check NumPy’s table of ecosystem support for details.

One of the main reasons for using NumPy is that the library can do fast and convenient array operations. For a simple example, the following code calculates square numbers:

>>> numbers = range(10)
>>> [number**2 for number in numbers]
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]

>>> import numpy as np
>>> numbers = np.arange(10)
>>> numbers**2
array([ 0, 1, 4, 9, 16, 25, 36, 49, 64, 81])

First, you use range() and a list comprehension to calculate the first ten square numbers in pure Python. Then, you repeat the calculation with NumPy. Note that you don’t need to explicitly spell out the loop. NumPy handles that for you under the hood.

Furthermore, the NumPy version will be considerably faster, especially for bigger arrays of numbers. One of the secrets to this speed is that NumPy arrays are limited to having one data type, while a Python list can be heterogeneous. One list can contain elements as different as integers, floats, strings, and even nested lists. That’s not possible in a NumPy array.
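
As a quick aside (this snippet is not from the article), you can see the single-type restriction directly: when you build an array from a mixed list, NumPy coerces every element to one common data type.

import numpy as np

# A Python list happily mixes types ..
mixed = [1, 2.5, "three"]

# .. but a NumPy array must have a single dtype, so everything is
# coerced to a common type (here, a fixed-width string type).
coerced = np.array(mixed)
print(coerced.dtype)  # a string dtype such as '<U32'
print(coerced)        # ['1' '2.5' 'three']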

Improved String Handling

By enforcing all elements to be of the same type, each taking up the same number of bytes in memory, NumPy can quickly find and work with individual elements. One downside to this has been that strings can be awkward to work with:

>>> words = np.array(["numpy", "python"])
>>> words
array(['numpy', 'python'], dtype='<U6')

>>> words[1] = "monty python"
>>> words
array(['numpy', 'monty '], dtype='<U6')

You first create an array consisting of two strings. Note that NumPy automatically detects that the longest string is six characters long, so it sets aside space for each string to be six characters long. The 6 in the data type string, <U6, indicates this.

Next, you try to replace the second string with a longer string. Unfortunately, only the first six characters are stored since that’s how much space NumPy has set aside for each string in this array. There are ways to work around these limitations, but in NumPy 2.0, you can take advantage of variable length strings instead:
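
The article's own example sits behind the link below, but as a rough sketch of what the feature enables (assuming NumPy 2.0 and its new numpy.dtypes.StringDType), longer strings no longer get truncated on assignment:

import numpy as np

# NumPy 2.0 ships a variable-length string dtype, numpy.dtypes.StringDType.
words = np.array(["numpy", "python"], dtype=np.dtypes.StringDType())

# Assigning a longer string now keeps the full value instead of truncating it.
words[1] = "monty python"
print(words)  # ['numpy' 'monty python']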

Read the full article at https://realpython.com/python-news-july-2024/ »


Categories: FLOSS Project Planets

Salsa Digital: CivicTheme establishes a formal steering committee to guide its rapid growth

Planet Drupal - Mon, 2024-07-08 08:00
We’re very excited to announce that we’ve organised the official CivicTheme Steering Committee (aka Steerco) to help shape the future of the CivicTheme open-source project. In this article, we provide an overview of the steering committee and its objectives, as well as how you can get involved with the CivicTheme project.

Our seed committee

We have been planning the committee's formation for some time and have now created the inaugural “seed committee” to get things focused and back on track.
Categories: FLOSS Project Planets

roose.digital: How to prefill a webform field based on the node/URL being viewed in Drupal 10

Planet Drupal - Mon, 2024-07-08 07:33
In this simple tutorial I will show you how you can fill a webform field with a value based on the node the visitor is viewing.
Categories: FLOSS Project Planets

drunomics: Stop making your own life difficult. Integrate your CI/CD Pipeline with these monitoring tools

Planet Drupal - Mon, 2024-07-08 07:28
By jeremy.chinquist, Mon, 07/08/2024 - 13:28

If you do not use a CI/CD pipeline, we highly recommend it. For the past several years our team has set up projects in a CI/CD pipeline using Jenkins or GitHub Actions. The tools that we will present in this session have proven to be a good set of automated testing tools for several projects. Over time we have chosen to move away from previous solutions and to new testing frameworks. As part of our commitment to high standards of performance and quality, we are always on the lookout for the best tools available on the market.
Categories: FLOSS Project Planets

Anwesha Das: Euro Python 2024

Planet Python - Mon, 2024-07-08 05:56

It is July, and it is time for Euro Python, and 2024 is my first Euro Python. Some busy days are on the way. Like every other conference, I have my diary, and the conference days are full of various activities.

Day 0 of the main conference

After a long time, I will give a legal talk. We are going to dig into some basics of Intellectual Property. What is it? Why do we need it? What are the different kinds of intellectual property? It is a legal talk designed for developers, so anyone and everyone from the community, regardless of previous legal knowledge, can understand the content and use it to understand their fundamental rights and duties as developers. The talk, Intellectual Property 101, is scheduled at 11:35 hrs.

Day 1 of the main conference

Day 1 is PyLadies Day, a day dedicated to PyLadies. We have crafted the day with several different kinds of events. The day opens with a self-defense workshop at 10:30 hrs. PyLadies, throughout the world, aims to provide and foster a safe space for women and friends in the Python community. This workshop is an extension of that goal. We will learn how to deal with challenging, inappropriate behavior in the community, at work, or in any social space. We will have a trained psychologist as a session guide to help us. This workshop is as important today as it was yesterday, and it may well be in the future (at least until the enforcement of the CoC is clear). I am so looking forward to the workshop. Thank you, Mia, Lias and all the PyLadies for organizing this and giving shape to my long-cherished dream.

Then we have my favorite part of the conference, PyLadies Lunch. I crafted the afternoon with a little introduction session, shout-out session, food, fun, laughter, and friends.

After the PyLadies Lunch, I have my only non-PyLadies session, which is a panel discussion on Open Source Sustainability. We will discuss the different aspects of sustainability in the open source space and community.

Again, it is PyLady's time. Here, we have two sessions.

IAmRemarkable (https://ep2024.europython.eu/pyladies-events#iamremarkable) aims to empower you by celebrating your achievements and fighting impostor syndrome. The workshop will help you celebrate your accomplishments and improve your self-promotion skills.

The second session is a 1:1 mentoring event, Meet & Greet with PyLadies. Here, willing PyLadies will be able to mentor and be mentored. They can be coached in different subjects, from programming and learning to topics related to jobs and careers.

Birds of a Feather session on Release Management of Open Source projects

It is an open discussion about release management across the open source ecosystem.
The discussion includes everything from community-led projects to projects maintained or initiated by a big enterprise, and from projects maintained by a single contributor to projects with several hundred contributors. What are the different methods we follow regarding versioning, release cadence, and the process itself? Do most of us follow manual processes or depend on automated ones? What works and what does not, and how can we improve our lives? What are the significant points that make the difference? We will discuss and cover the following topics: release management of open source projects, security, automation, CI usage, and documentation. In the discussion, I will share my release automation journey with Ansible. We will follow the Chatham House Rule during the discussion to provide the space for open, frank, and collaborative conversation.

So, here come the days of code, collaboration, and community. See you all there.

PS: I miss my little Py-Lady volunteering at the booth.

Categories: FLOSS Project Planets

joshics.in: Mastering Multi-Site Configurations in Drupal: A Comprehensive Guide

Planet Drupal - Mon, 2024-07-08 05:56
By bhavinhjoshi, Mon, 07/08/2024 - 15:26

In the constantly evolving digital world, the ability to efficiently manage multiple websites has become a necessity for businesses of all sizes. 

Thankfully, Drupal, an open-source content management system, has made this simpler with its multi-site configuration feature. 

This functionality makes it easier to handle numerous websites from a single Drupal installation, saving time, effort, and resources. But how do we configure this feature in Drupal? 

This blog post explores the ways to achieve multi-site configurations in Drupal in thorough detail.

Understanding Drupal Multi-Site Configurations

Before we go deeper, let's understand what Drupal multi-site configuration means. Simply put, it allows you to run multiple websites from one codebase. Each website can have its own content, settings, enabled modules, and themes, while sharing the core code, contributed modules, and themes. This arrangement benefits website managers who manage multiple sites, as they can apply updates to all at once.

How to Set Up Multi-Site Configurations
  1. Creating Sub-Directories
    The first step is to create sub-directories for each site in the 'sites' directory. This is where individual settings for each site reside. The directory name would typically be your site's URL. For instance, if your site's URL is 'example.com', the directory name would be 'sites/example.com'.
  2. Setting Up the Database
    Each site requires its own database. During Drupal installation, you need to set up a new database for each site. Remember to use the 'utf8mb4_general_ci' collation for each database to avoid any characters failing to write to the database.
  3. Configuring Settings.php
    For each site, you will need a settings.php file. This file contains critical information about your site such as base URL, database credentials, and more. You can find a default.settings.php file in the 'default' directory. Copy this file into your new site directory and rename it to 'settings.php'. Update the necessary details like the database name, username, and password.
  4. Configuring the Web Server
    Next, you need to configure your web server to point to the correct site directory. For Apache servers, you would use the .htaccess file, while nginx servers use the nginx.conf file.
  5. Installing Drupal
    Finally, install Drupal for each site by navigating to your site's URL in a web browser. Follow the installation prompts, and in no time, your website will be up and running.
The Importance of Multi-Site Configurations

With multi-site configurations, you can centralise your web management tasks, reducing the need for redundant tasks. You can apply core updates, security patches, and other changes across all your sites with a single stroke. This translates into reduced effort, time, and risk of errors.

Further, this simplifies your hosting environment as you're using a single codebase, making it easier to manage your server resources and optimise for performance.

Pitfalls to Avoid

Despite its numerous benefits, multi-site configurations are not without their challenges. Remember, changes made are site-wide; an update beneficial to one site might disrupt another. Thus, always carry out extensive testing before deploying changes. Additionally, ensure to maintain regular backups to quickly restore any problematic updates.

Conclusion

Mastering Drupal's multi-site configurations can become an asset in your digital arsenal. It not only optimises resources but also streamlines your web management process. However, it requires strategic planning and careful execution to exploit its full potential.

Drupal Multi-site Drupal Planet
Categories: FLOSS Project Planets

mark.ie: Want to contribute to LocalGov Drupal, but don't know where to start?

Planet Drupal - Mon, 2024-07-08 05:08

Contributing to LocalGov Drupal is a great way to get used to how the platform works and, in turn, makes the platform better for everyone. Here are some thoughts on how to get started.

Categories: FLOSS Project Planets

Talk Python to Me: #469: PuePy: Reactive frontend framework in Python

Planet Python - Mon, 2024-07-08 04:00
Python is one of the most popular languages of the current era. It dominates data science, it's an incredible choice for web development, and it's many people's first language. But it's not super great on front-end programming, is it? Frameworks like React, Vue and other JavaScript frameworks rule the browser and few other languages even get a chance to play there. But with pyscript, which I've covered several times on this show, we have the possibility of Python on the front end. Yet it's not really a front-end framework, just a runtime in the browser. That's why I'm excited to have Ken Kinder on the podcast to talk about his project PuePy, a reactive frontend framework in Python.

Episode sponsors

  • Sentry Error Monitoring, Code TALKPYTHON: https://talkpython.fm/sentry
  • Code Comments: https://talkpython.fm/code-comments
  • Talk Python Courses: https://talkpython.fm/training

Links from the show

  • Michael's Code in a Castle Course: https://talkpython.fm/castle
  • Ken Kinder: @bouncing@twit.social
  • PuePy: https://puepy.dev/
  • PuePy Docs: https://docs.puepy.dev/
  • PuePy on GitHub: https://github.com/kkinder/puepy
  • pyscript: https://pyscript.net
  • VueJS: https://vuejs.org
  • Hello World example: https://docs.puepy.dev/hello-world.html
  • Tutorial: https://docs.puepy.dev/tutorial.html
  • Tutorial running at pyscript.com: https://pyscript.com/@kkinder/puepy-tutorial/latest
  • Micropython: https://micropython.org
  • Pyodide: https://pyodide.org/en/stable/
  • PgQueuer: https://github.com/janbjorge/PgQueuer
  • Writerside: https://www.jetbrains.com/writerside/
  • Michael's PWA pyscript app: https://github.com/mikeckennedy/pyscript-pwa-example
  • Michael's demo of a PWA pyscript app: https://www.youtube.com/watch?v=lC2jUeDKv-s
  • Python iOS Web App with pyscript and offline PWAs video: https://www.youtube.com/watch?v=Nct0usblj64
  • Watch this episode on YouTube: https://www.youtube.com/watch?v=-mbVh24qQmA
  • Episode transcripts: https://talkpython.fm/episodes/transcript/469/puepy-reactive-frontend-framework-in-python

Stay in touch with us

  • Subscribe to us on YouTube: https://talkpython.fm/youtube
  • Follow Talk Python on Mastodon: https://fosstodon.org/web/@talkpython
  • Follow Michael on Mastodon: https://fosstodon.org/web/@mkennedy
Categories: FLOSS Project Planets

Zato Blog: Integrating with WordPress and Elementor API webhooks

Planet Python - Mon, 2024-07-08 03:43
2024-07-08, by Dariusz Suchojad

Overview

Consider this scenario:

  • You have a WordPress instance, possibly installed in your own internal network
  • With WordPress, you use Elementor, a popular website builder
  • A user fills out a form that you prepared using Elementor, e.g. the user provides his or her email and username to create an account in your CRM
  • Now, after WordPress processes this information accordingly, you also need to send it all to a remote backend system that only accepts JSON messages

The concern here is that WordPress alone will not send it to the backend system.

Hence, we are going to use an Elementor-based webhook that will invoke Zato which will be acting as an integration layer. In Zato, we will use Python to transform the results of what was submitted in the form - in order to deliver it to the backend API system using a REST call.

Creating a channel

A Zato channel is a way to describe the configuration of a particular API endpoint. In this case, to accept data from WordPress, we are going to use REST channels:

In the screenshot below, note particularly the highlighted data format field. Typically, REST channels will use JSON, but here, we need to use "Form data" because this is what we are getting from Elementor.

Now, we can add the actual code to accept the data and to communicate with the remote, backend system.

Python code

Here is the Python code and what follows is an explanation of how it works:

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

# Field configuration
field_email = 'fields[email][value]'
field_username = 'fields[username][value]'

class CreateAccount(Service):

    # The input that we expect from WordPress, i.e. what fields it needs to send
    input = field_email, field_username

    def handle(self):

        # This is a dictionary object with data from WordPress ..
        input = self.request.input

        # .. so we can use dictionary access to extract values received ..
        email = input[field_email]
        username = input[field_username]

        # .. now, we can create a JSON request for the backend system ..
        # .. again, using a regular Python dictionary ..
        api_request = {
            'Email': email,
            'Username': username,
        }

        # Obtain a connection to the backend system ..
        conn = self.out.rest['CRM'].conn

        # .. invoke that system, providing all the information on input ..
        response = conn.post(self.cid, api_request)

        # .. and log the response received ..
        self.logger.info('Backend response -> %s', response.data)

The format of data that Elementor will use is of a specific nature. It is not JSON and the field names are not sent directly either.

That is, if your form has fields such as "email" and "username", what the webhook sends is named differently. The names will be, respectively:

  • fields[email][value]
  • fields[username][value]

If it were JSON, we could say that instead of this ..

{ "email": "hello@example.com", "username": "hello" }

.. the webhook would be sending you this:

{ "fields[email][value]": "hello@example.com", "fields[username][value]": "hello" }

The above format explains why, in the Python code above, we extract all the input fields from WordPress from the "self.request.input" object using its dictionary access syntax.

Normally, if the field was plain "username", we would be doing "self.request.input.username" but this is not available in this case because of the naming conventions of the fields from Elementor.

Now, the only remaining part is the definition of the outgoing REST connection that the service should use.

Outgoing REST connections

Create an outgoing REST connection as below - this time around, note that the data format is JSON.

Using the REST channel

In your WordPress dashboard, create a webhook using Elementor and point it to the channel created earlier, e.g. make Elementor invoke an address such as http://10.157.11.39:11223/api/wordpress/create-account

Each time a form is submitted, its contents will go to Zato, your service will transform it to JSON and the backend CRM system will be invoked.
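
If you want to exercise the channel without WordPress, one quick option is to simulate the Elementor webhook's form-encoded POST yourself. The sketch below is not part of the original article; it reuses the example address and field names shown above, so adjust them to your own setup:

import requests

# Simulate what the Elementor webhook sends: form-encoded data with the
# field names wrapped as fields[...][value], matching the service above.
payload = {
    'fields[email][value]': 'hello@example.com',
    'fields[username][value]': 'hello',
}

response = requests.post(
    'http://10.157.11.39:11223/api/wordpress/create-account',
    data=payload,
)
print(response.status_code)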

And this is everything - you have just integrated WordPress, Elementor webhooks and an external API backend system in Python.

More resources

➤ Python API integration tutorial
What is an integration platform?
Python Integration platform as a Service (iPaaS)
What is an Enterprise Service Bus (ESB)? What is SOA?

More blog posts
Categories: FLOSS Project Planets

Dirk Eddelbuettel: RcppArmadillo 14.0.0-1 on CRAN: New Upstream

Planet Debian - Mon, 2024-07-08 02:39

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language–and is widely used by (currently) 1158 other packages on CRAN, downloaded 35.1 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 587 times according to Google Scholar.

Conrad released a new major upstream version 14.0.0 a couple of days ago. We had been testing this new version extensively over several rounds of reverse-dependency checks across all 1100+ packages. This revealed nine packages requiring truly minor adjustments—which eight maintainers made in a matter of days; all this was coordinated in issue #443. Following the upload, CRAN noticed one more issue (see issue #446) but this turned out to be local to the package. There are also renewed deprecation warnings with some Armadillo changes which we will need to address one-by-one. Last but not least with this release we also changed the package versioning scheme to follow upstream Armadillo more closely.

The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version 14.0.0-1 (2024-07-05)
  • Upgraded to Armadillo release 14.0.0 (Stochastic Parrot)

    • C++14 is now the minimum recommended C++ standard

    • Faster handling of compound expressions by as_scalar(), accu(), dot()

    • Faster interactions between sparse and dense matrices

    • Expanded stddev() to handle sparse matrices

    • Expanded relational operators to handle expressions between sparse matrices and scalars

    • Added .as_dense() to obtain dense vector/matrix representation of any sparse matrix expression

    • Updated physical constants to NIST 2022 CODATA values

  • New package version numbering scheme following upstream versions

  • Re-enabling ARMA_IGNORE_DEPRECATED_MARKE for silent CRAN builds

Courtesy of my CRANberries, there is a diffstat report relative to previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page.

If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

joshics.in: Journey into Drupal: Becoming a Contributor That Counts!

Planet Drupal - Mon, 2024-07-08 00:53
By bhavinhjoshi, Mon, 07/08/2024 - 10:23

Drupal, the renowned content management platform, is a result of collective efforts by developers and contributors worldwide.

But have you ever considered how you can contribute to its evolution? Let's unpack the benefits of contributing, explore avenues to contribute and decipher tools that ease the process.

The Benefits of Contributing

Contributing to Drupal is a symbiotic process, where every effort you put in reaps multiple rewards:

1. Skill Enhancement: Whether you choose to contribute code, help with testing, design or translations, each contribution is an opportunity to sharpen your skills. 

2. Networking: Engage with the global Drupal community at meetups, workshops, and forums. These platforms present a potent opportunity to connect with fellow developers, learn about Drupal trends, and imbibe best practices.

3. Impact: Every contribution counts. Whether it's contributing code, reporting bugs, suggesting enhancements, or sharing expertise and insights in the community, each contribution plays a pivotal role. These collective contributions help in creating a more robust, user-friendly, and advanced platform. Thus, every contribution, regardless of its size or nature, holds immense value as it brings about change and propels Drupal forward in its ongoing evolution.

Contribute to Drupal: The How-To Guide

There's a multitude of ways to contribute to Drupal. Here's a few:

1. Code Development: If you're proficient in PHP, consider developing for Drupal core or contributing modules. 

2. Testing: Quality assurance plays a crucial role in Drupal development. Test new features or updates, and report bugs. 

3. Documentation: A clear, comprehensive documentation bolsters user experience and uptake.

4. Community Support: Help others on Drupal forums or organise Drupal events. 

Tools That Make Drupal Contribution Easy

Several tools can support your journey as a Drupal contributor:

1. Drupal.org: A treasure trove of resources, Drupal.org is your starting point.

2. 'Simply Test Me': This tool lets you test patches or projects before submission. 

3. Drupal CI: This integrated tool helps test your code's compatibility with different environments. 

4. Dreditor: This browser extension streamlines code review and submission. It aids in identifying errors within the code, ensuring your contributions are more sturdy and resilient.

5. Slack: Join the Drupal Slack workspace for ongoing discussions and mentorship. 

In conclusion, contributing to Drupal is not just about giving back to the community but also about enhancing your skills, creating influential networks, and leaving a lasting impact. Remember, every contribution matters, no matter how small. 

So, wear your Drupal contributor hat and let’s keep the wheel of innovation turning. Together, we can shape the future of this robust platform. Every step we take, every effort we make, brings Drupal a step closer to the epitome of perfection. 

It’s a journey, and everyone’s invited. After all, the best way to predict the future is to create it. Let's create the Drupal we want, together!

Drupal Community Drupal Planet
Categories: FLOSS Project Planets

Russ Allbery: rra-c-util 11.0.0

Planet Debian - Sun, 2024-07-07 22:36

rra-c-util is my collection of utility and test functions that I keep synchronized between my packages. The big change in this release is that I've switched to semantic versions, which I plan to do for all of my packages, and I've started using scriv to manage a Markdown change log. We've been using scriv for a while at work, so I have finally gotten on board the change log fragment train rather than assuming linear commits.

This release also raises the minimum Perl version for all of the Perl support code to 5.12 so that I can use semantic versions for all modules, and updates the perltidy configuration with lots of improvements from Julien ÉLIE. There is also the normal variety of bug fixes and more minor improvements.

You can get the latest version from the rra-c-util distribution page.

Categories: FLOSS Project Planets

Russ Allbery: Review: Beyond Control

Planet Debian - Sun, 2024-07-07 22:07

Review: Beyond Control, by Kit Rocha

Series: Beyond #2
Publisher: Kit Rocha
Copyright: December 2013
ASIN: B00GIA4GN8
Format: Kindle
Pages: 364

Beyond Control is science fiction erotica (dystopian erotic romance, per the marketing) and a direct sequel to Beyond Shame. These books shift protagonists with each volume and enough of the world background is explained that you could start here, but there are significant spoilers for the previous book. I read this book as part of the Beyond Series Bundle (Books 1-3), which is what the sidebar information is for.

This is one of those reviews that I write because I'm stubborn about reviewing all the books I read, not because it's likely to be useful to anyone. There are also considerably more spoilers for the shape of the story than I normally include, so be warned.

The Beyond series is erotica. Specifically, so far, consensual BDSM erotica with bisexuality but otherwise typical gender stereotypes. The authors (Kit Rocha is a pen name for Donna Herren and Bree Bridges) are women, so it's more female gaze than male gaze, but by erotica I don't mean romance with an above-average number of steamy scenes. I mean it felt like half the book by page count was descriptions of sex.

This review is rather pointless because, one, I'm not going to review the sex that's the main point of the book, and two, I skimmed all the sex and read it for the story because I'm weird. Beyond Shame got me interested in these absurdly horny people and their post-apocalyptic survival struggles in the outskirts of a city run by a religious surveillance state, and I wanted to find out what happened next. Besides, this book promised to focus on my favorite character from the first novel, Lex, and I wanted to read more about her.

Beyond Control uses a series pattern that I understand is common in romance but which is not often seen in SFF (my usual genre): each book focuses on a new couple adjacent to the previous couple, while the happily ever after of the previous couple plays out in the background. In this case, it also teases the protagonists of the next book. I can see why romance uses this structure: it's an excuse to provide satisfying interludes for the reader. In between Lex and Dallas's current relationship problems, one gets to enjoy how well everything worked out for Noelle and how much she's grown.

In Beyond Shame, Lex was the sort-of partner of Dallas O'Kane, the leader of the street gang that is running Sector Four. (Picture a circle surrounding the rich-people-only city of Eden. That circle is divided into eight wedge-shaped sectors, which provide heavy industries, black-market pleasures, and slums for agricultural workers.) Dallas is an intensely possessive, personally charismatic semi-dictator who cultivates the image of a dangerous barbarian to everyone outside and most of the people inside Sector Four. Since he's supposed to be one of the good guys, this is more image than reality, but it's not entirely disconnected from reality.

This book is about Lex and Dallas forming an actual relationship, instead of the fraught and complicated thing they had in the first book. I was hoping that this would involve Dallas becoming less of an asshole. It unfortunately does not, although some of what I attributed to malice may be adequately explained by stupidity. I'm not sure that's an improvement.

Lex is great, just like she was in the first book. It's obvious by this point in the series that she does most of the emotional labor of keeping the gang running, and her support is central to Dallas's success. Like most of the people in this story, she has a nasty and abusive background that she's still dealing with in various ways. Dallas's possessiveness is intensely appealing to her, but she wants that possessiveness on different terms than Dallas may be willing to offer, or is even aware of. Lex was, I thought, exceptionally clear about what she wanted out of this relationship. Dallas thinks this is entirely about sex, and is, in general, dumber than a sack of hammers. That means fights. Also orgies, but, well, hopefully you knew what you were getting into if you picked up this book.

I know, I know, it's erotica, that's the whole point, but these people have a truly absurd amount of sex. Eden puts birth control in the water supply, which is a neat way to simplify some of the in-story consequences of erotica. They must be putting aphrodisiacs in the water supply as well.

There was a lot of sector politics in this book that I found way more interesting than it had any right to be. I really like most of these people, even Dallas when he manages to get his three brain cells connected for more than a few minutes. The events of the first book have a lot of significant fallout, Lex continues being a badass, the social dynamics between the women are very well-done (and pass the Bechdel test yet again even though this is mostly traditional-gender-role erotica), and if Dallas had managed to understand what he did wrong at a deeper-than-emotional level, I would have rather enjoyed the non-erotica story parts. Alas.

I therefore wouldn't recommend this book even if I were willing to offer any recommendations about erotica (which I'm not). I was hoping it was going somewhere more rewarding than it did. But I still kind of want to read another one? I am weirdly fascinated with the lives of these people. The next book is about Six, who has the potential to turn into the sort of snarky, cynical character I love reading about. And it's not that hard to skim over the orgies. Maybe Dallas will get one additional brain cell per book?

Followed by Beyond Pain.

Rating: 5 out of 10

Categories: FLOSS Project Planets

Wingware: Wing Python IDE Version 10.0.5 - July 8, 2024

Planet Python - Sun, 2024-07-07 21:00

Wing 10.0.5 adds support for running the IDE on arm64 Linux, updates the German language UI localization, changes the default OpenAI model to the lower-cost and better-performing gpt-4o, and fixes several bugs.

See the change log for details.

Download Wing 10 Now: Wing Pro | Wing Personal | Wing 101 | Compare Products


What's New in Wing 10

AI Assisted Development

Wing Pro 10 takes advantage of recent advances in the capabilities of generative AI to provide powerful AI assisted development, including AI code suggestion, AI driven code refactoring, description-driven development, and AI chat. You can ask Wing to use AI to (1) implement missing code at the current input position, (2) refactor, enhance, or extend existing code by describing the changes that you want to make, (3) write new code from a description of its functionality and design, or (4) chat in order to work through understanding and making changes to code.

Examples of requests you can make include:

"Add a docstring to this method"
"Create unit tests for class SearchEngine"
"Add a phone number field to the Person class"
"Clean up this code"
"Convert this into a Python generator"
"Create an RPC server that exposes all the public methods in class BuildingManager"
"Change this method to wait asynchronously for data and return the result with a callback"
"Rewrite this threaded code to instead run asynchronously"

Yes, really!

Your role changes to one of directing an intelligent assistant capable of completing a wide range of programming tasks in relatively short periods of time. Instead of typing out code by hand every step of the way, you are essentially directing someone else to work through the details of manageable steps in the software development process.

Read More

Support for Python 3.12 and ARM64 Linux

Wing 10 adds support for Python 3.12, including (1) faster debugging with PEP 669 low impact monitoring API, (2) PEP 695 parameterized classes, functions and methods, (3) PEP 695 type statements, and (4) PEP 701 style f-strings.
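
As a brief illustration of the Python 3.12 syntax mentioned above (this is generic Python, not Wing-specific code):

# PEP 695: parameterized (generic) classes and functions
class Stack[T]:
    def __init__(self) -> None:
        self.items: list[T] = []

    def push(self, item: T) -> None:
        self.items.append(item)

# PEP 695: the new type statement for type aliases
type Vector = list[float]

# PEP 701: quotes can now be reused inside f-string expressions
names = ["Ada", "Grace"]
print(f"names: {", ".join(names)}")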

Wing 10 also adds support for running Wing on ARM64 Linux systems.

Poetry Package Management

Wing Pro 10 adds support for Poetry package management in the New Project dialog and the Packages tool in the Tools menu. Poetry is an easy-to-use cross-platform dependency and package manager for Python, similar to pipenv.

Ruff Code Warnings & Reformatting

Wing Pro 10 adds support for Ruff as an external code checker in the Code Warnings tool, accessed from the Tools menu. Ruff can also be used as a code reformatter in the Source > Reformatting menu group. Ruff is an incredibly fast Python code checker that can replace or supplement flake8, pylint, pep8, and mypy.


Try Wing 10 Now!

Wing 10 is a ground-breaking new release in Wingware's Python IDE product line. Find out how Wing 10 can turbocharge your Python development by trying it today.

Downloads: Wing Pro | Wing Personal | Wing 101 | Compare Products

See Upgrading for details on upgrading from Wing 9 and earlier, and Migrating from Older Versions for a list of compatibility notes.

Categories: FLOSS Project Planets

Calamares ABI Checking

Planet KDE - Sun, 2024-07-07 18:00

Seems like over 3 years ago I wrote something about ABI stability checking and investigated a little how tools could be used to help maintain ABI stability for Calamares. Here are some callback notes.

Tooling Choices

I ended up using abigail for ABI checking, because it’s available on FreeBSD and Ubuntu and just seems like it has a compatible mindset for what I want to do: tell me the ABI difference between two Calamares releases.

The tool for that is abidiff and applying the tool is literally a matter of getting two versions of a shared object (.so) and running diff.

In Calamares I put together some scripts to automate this, since building versions of the Calamares shared objects is not quite trivial. The scripts also help out with the ABI-stability promise (or, um .. pinky swear, maybe).

Calamares ABI Stability Policy

When the Calamares 3.3 series started, one of the ideas was that in 3.3, the ABI should be stable, so that external / third party modules for Calamares should be able to keep working without a recompile. During 3.2 development, things were very unstable and everything needed to be recompiled all the time – and having a Boost dependency didn’t help much either, since there are so many distros with poor .so hygiene.

So the idea was to keep ABI stability, and to check that it was so.

Unfortunately, 3.3.0 released with a wide-open ABI because I forgot to turn on hidden-visibility by default, and there were a lot of not-quite-there cleanups that still needed doing.

By the time 3.3.3 came out, three months later, things were in better shape.

Right now, I’ve defined 3.3.3 as “the stable starting point”, and check ABI compatibility against that every now-and-again. Not regularly, not as part of releases (I suppose I would accept a PR that added a check that enforces it as part of the release script).

Anyway, the idea is that there should be “minimal” ABI changes. That is a lot less strict than, say, the KDE Frameworks Policies about C++ compatibility (hm .. those pages talk a lot more about kdelibs and KDE4 than I would have expected). New classes are ok, and it’s also ok for them to take a couple of releases before “settling down”.

Calamares Current ABI Issues

Abigail is fairly verbose and quite explicit about changes; I like that. Here’s the summary of summary information between 3.3.3 and 3.3.9 (not released yet, development branch):

Changed leaf types summary: 1 leaf type changed
Removed/Changed/Added functions summary: 0 Removed, 2 Changed, 8 Added functions
Removed/Changed/Added variables summary: 0 Removed, 0 Changed, 1 Added variable
Function symbols changes summary: 0 Removed, 4 Added function symbols not referenced by debug info
Variable symbols changes summary: 0 Removed, 3 Added variable symbols not referenced by debug info

Added isn’t a problem (there was an issue where the PC would suspend / sleep during installation if you didn’t prod the mouse, and I added a class to manage sleep-inhibition through the XDG DBus API). Changed ones are more serious:

'struct Calamares::CommandLine at CommandList.h:31:1' changed:
  type size changed from 128 to 256 (in bits)
  2 data member insertions:
    'std::__1::chrono::seconds m_timeout', at offset 128 (in bits) at CommandList.h:101:1
    'std::__1::optional<bool> m_verbose', at offset 192 (in bits) at CommandList.h:102:1

I’ve decided that this class is sufficiently “internal-ish” and sufficiently new that I’m not going to worry about it – independent of the question whether anyone even builds external C++ modules for Calamares that use it.

Calamares Big ABI Changes

Just for the record, there’s nothing planned. I can imagine one or two things that would drive a big ABI upheaval: one is adding a virtual function to modules to help with consistent configuration loading – so that we can get better checking up-front of what distros are putting into the configuration files, rather than waiting for an installation to be botched.

Categories: FLOSS Project Planets

Dirk Eddelbuettel: RcppSimdJson 0.1.12 on CRAN: Maintenance

Planet Debian - Sun, 2024-07-07 15:18

A new maintenance release 0.1.12 of the RcppSimdJson package is now on CRAN.

RcppSimdJson wraps the fantastic and genuinely impressive simdjson library by Daniel Lemire and collaborators. Via very clever algorithmic engineering to obtain largely branch-free code, coupled with modern C++ and newer compiler instructions, it results in parsing gigabytes of JSON parsed per second which is quite mindboggling. The best-case performance is ‘faster than CPU speed’ as use of parallel SIMD instructions and careful branch avoidance can lead to less than one cpu cycle per byte parsed; see the video of the talk by Daniel Lemire at QCon.

This release responds to another CRAN request, this time to accommodate compilation under C++20 with g++-14. As this was already addressed upstream in simdjson, it was simply a matter of upgrading to the current upstream, which Daniel did in a PR.

The (once again very short) NEWS entry for this release follows.

Changes in version 0.1.12 (2024-07-05)
  • Updated benchmarks now include 'yyjsonr'

  • simdjson was upgraded to version 3.95 (Daniel in #92 fixing #91)

  • Additional small update for C++20 compilation under g++-14

Courtesy of my CRANberries, there is also a diffstat report for this release. For questions, suggestions, or issues please use the issue tracker at the GitHub repo.

If you like this or other open-source work I do, you can now sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

Russ Allbery: DocKnot v8.0.0

Planet Debian - Sun, 2024-07-07 13:43

DocKnot is my static web site generator, with additional features for managing software releases and package documentation.

This release switches to semantic versioning for the Perl modules, hence the v prefix to the version number. This appears to be the standard in the Perl world for indicating that the version follows the semantic versioning standard. That also required adding support to DocKnot for release tarballs whose version string starts with v. I plan to convert all of my Perl modules to semantic versioning in their next releases.

The main change in this release is that it incorporates my old faq2html script that was previously used to convert text documents to HTML for publication. This code was imported into DocKnot and integrated with the rest of the package, so text documents can now be referenced by *.spin pointers rather than the (now-deprecated) *.faq files.

The long delay in the release is because I was hoping to refactor the faq2html code to work as a proper module with no global state and to follow my current Perl coding guidelines before releasing a version of DocKnot containing it, but the refactoring is taking forever and support for v-prefixed versions was blocking other releases, so I'm releasing it in a less-than-ideal state and hoping to fix it later.

There are also a few other bug fixes and improvements, the most notable of which is probably that the footer on generated web pages now points properly to the DocKnot distribution page rather than my old spin page.

You can get the latest version from CPAN or from the DocKnot distribution page.

Categories: FLOSS Project Planets

Robin Wilson: Who reads my blog? Send me an email or comment if you do!

Planet Python - Sun, 2024-07-07 13:23

I’m interested to find out who is reading my blog. Following the lead of Jamie Tanna who was in turn copying Terence Eden (both of whose blogs I read), I’d like to ask people who read this to drop me an email or leave a comment on this post if you read this blog and see this post. I have basic analytics on this blog, and I seem to get a reasonable number of views – but I don’t know how many of those are real people, and how many are bots etc.

Feel free to just say hi, but if you have chance then I’d love to find out a bit more about you and how you read this. Specifically, feel free to answer any or all of the following questions:

  • Do you read this on the website or via RSS?
  • Do you check regularly/occasionally for new posts, or do you follow me on social media (if so, which one?) to see new posts?
  • How did you find my blog in the first place?
  • Are you interested in and/or working in one of my specialisms – like geospatial data, general data science/data processing, Python or similar?
  • Which posts do you find most interesting? Programming posts on how to do things? Geographic analyses? Book reviews? Rare unusual posts (disability, recipes etc)?
  • Have you met me in real life?
  • Is there anything particular you’d like me to write about?

The comments box should be just below here, and my email is robin@rtwilson.com

Thanks!

Categories: FLOSS Project Planets

Niels Thykier: Improving packaging file detection in Debian

Planet Debian - Sun, 2024-07-07 08:00

Debian packaging consists of a directory (debian/) containing a number of "hard-coded" filenames such as debian/control, debian/changelog, debian/copyright. In addition to these, many packages will also use a number of optional files that are named via a pattern such as debian/{{PACKAGE}}.install.

At a high level the patterns looks deceptively simple. However, if you start working on trying to automatically classify files in debian/ (which could be helpful to tell the user they have a typo in the filename), you will quickly realize these patterns are not machine friendly at all for this purpose.

The patterns deconstructed

To appreciate the problem fully, here is a primer on the pattern and all its issues. If you are already well-versed in these, you might want to skip the section.

The most common patterns are debian/package.stem or debian/stem, and the usual go-to example is the install stem (a concrete example being debian/debhelper.install). However, the full pattern consists of 4 parts, 3 of which are optional.

  • The package name followed by a period. Optional, but must be the first if present.
  • The name segment followed by a period. Optional, but must appear between the package name (if present) and the stem. If the package name is not present, then the name segment must be first.
  • The stem. Mandatory.
  • An architecture restriction prefixed by a period. Optional, must appear after the stem if present.

To visualize it with [foo] to mark optional parts, it looks like debian/[PACKAGE.][NAME.]STEM[.ARCH]

Detecting whether a given file is in fact a packaging file now boils down to reverse engineering its name against this pattern. Again, so far, it might still look manageable. One major complication is that every part (except ARCH) can contain periods. So a trivial "split by period" is not going to cut it. As an example:

debian/g++-3.0.user.service

This example is deliberately crafted to be ambiguous and show this problem in its full glory. This file name can be parsed in multiple ways:

  • Is the stem service or user.service? (both are known stems from dh_installsystemd and dh_installsystemduser respectively). In fact, it can be both at the same time with "clever" usage of --name=user passed to dh_installsystemd.
  • The g++-3.0 can be a package prefix or part of the name segment. Even if there is a g++-3.0 package in debian/control, then debhelper (until compat 15) will still happily match this file for the main package if you pass --name=g++-3.0 to the helper. Side bar: Woe is you if there is a g++-3 and a g++-3.0 package in debian/control, then we have multiple options for the package prefix! Though, I do not think that happens in practice.

Therefore, there are a lot of possible ways to split this filename that all matches the pattern but with vastly different meaning and consequences.

Making detection practical

To make this detection practical, let's look at the first problems that we need to solve.

  1. We need the possible stems up front to have a chance at all. When multiple stems are an option, go for the longest match (that is, the one with the most periods), since --name is rare and "code golfing" is even rarer. (A small sketch of this idea follows the list.)
  2. We can make the package prefix mandatory for files with the name segment. This way, the moment there is something before the stem, we know the package prefix will be part of it and can cut it. It does not solve the ambiguity if one package name is a prefix of another package name (from the same source), but it is still a lot better. This made its way into debhelper compat 15, and now it is "just" a slow, long way to a better future.
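
As a rough sketch of the longest-match idea, here is a small, self-contained Python illustration. This is not debputy's actual implementation, and the stem list is made up for the example:

# Illustrative only: a toy matcher for debian/ file names against known stems.
KNOWN_STEMS = {"install", "service", "user.service", "dirs", "docs"}

def classify(filename):
    """Return (prefix_or_None, stem), or None if no known stem matches."""
    # Try the stem with the most periods first, i.e. the longest match.
    for stem in sorted(KNOWN_STEMS, key=lambda s: s.count("."), reverse=True):
        if filename == stem:
            return None, stem             # plain debian/STEM
        if filename.endswith("." + stem):
            # Whatever precedes the stem is the package and/or name prefix;
            # a real tool would validate it against the known package names.
            prefix = filename[: -(len(stem) + 1)]
            return prefix, stem
    return None                           # not a known packaging file

print(classify("g++-3.0.user.service"))   # ('g++-3.0', 'user.service')
print(classify("install"))                # (None, 'install')
print(classify("README"))                 # None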

A simple solution to the first problem could be to have a static list of known stems. That will get you started but the debhelper eco-system strive on decentralization, so this feels like a mismatch.

There is also a second problem with the static list. Namely, a given stem is only "valid" if the command in question is actually in use. Which means you now need to dumpster dive into the mess that is the Turing-complete debhelper configuration file known as debian/rules to fully solve that. Thanks to the Turing-completeness, we will never get a perfect solution for a static analysis.

Instead, it is time to back out and instead apply some simplifications. Here is a sample flow:

  1. Check whether the dh sequencer is used. If so, use some heuristics to figure out which addons are used.
  2. Delegate to dh_assistant to figure out which commands will be used and which debhelper config file stems it knows about. Here we need to know which sequences are in use from step one (if relevant). Combine this with any other sources for stems you have.
  3. Deconstruct all files in debian/ against the stems and known package names from debian/control. In theory, dumpster diving after --name options would be helpful here, but personally I skipped that part as I want to keep my debian/rules parsing to an absolute minimum.

With this logic, you can now:

  • Provide typo detection of the stem (debian/foo.intsall -> debian/foo.install), provided you have adequate handling of the corner cases (such as debian/*.conf not needing correction into debian/*.config)
  • Detect possible invalid package prefix (debian/foo.install without foo being a package). Note this has to be a weak warning unless the package is using debhelper compat 15 or you dumpster dived to validate that dh_install was not passed dh_install --name foo. Agreed, no one should do that, but they can and false positives are the worst kind of positives for a linting tool.
  • With some limitations, detect files used without the relevant command being active. As an example, some integration modes of debputy remove dh_install, so a debian/foo.install file would not be used.
  • Associate a given file with a given command to assist users with the documentation look up. Like debian/foo.user.service is related to dh_installsystemduser, so man dh_installsystemduser is a natural start for documentation.

I have added the logic for all these features in debputy though the documentation association is currently not in a user facing command. All the others are now diagnostics emitted by debputy in its editor support mode (debputy lsp server) or via debputy lint. In the editor mode, the diagnostics are currently associated with the package name in debian/control due to technical limitations of how the editor integration works.

Some of these features require the latest version of debhelper (a moving target at times). Check with debputy lsp features for the Extra dh support feature, which will be enabled if you have all you need.

Note: The detection is currently (mostly) ignoring files with architecture restrictions. That might be lifted in the future. However, architecture restricted config files tend to be rare, so they were not a priority at this point. Additionally, debputy for technical reasons ignores stem typos with multiple matches. That sadly means that typos of debian/docs will often be unreported due to its proximity to debian/dirs and vice versa.

Diving a bit deeper on getting the stems

To get the stems, debputy has 3 primary sources:

  1. Its own plugins can provide packager provided files. These are only relevant if the package is using debputy.
  2. It is also possible to provide a debputy plugin that identifies packaging files (either static or named ones). Though in practice, we probably do not want people to roll their own debputy plugin for this purpose, since the detection only works if the plugin is installed. I have used this mechanism to have debhelper provide a debhelper-documentation plugin to enrich the auto-detected data, and we can assume most people interested in this feature would have debhelper installed.
  3. It asks dh_assistant list-guessed-dh-config-files for config files, which is covered below.

The dh_assistant command uses the same logic as dh to identify the active add-ons and loads them. From there, it scans all commands mentioned in the sequence for the PROMISE: DH NOOP WITHOUT ...-hint and a new INTROSPECTABLE: CONFIG-FILES ...-hint. When these hints reference a packaging file (as an example, via pkgfile(foo)) then dh_assistant records that as a known packaging file for that helper.

Additionally, debhelper now also tracks commands that were removed from the sequence. Several of the dh_assistant subcommand now use this to enrich their (JSON) output with notes about these commands being known but not active.

The end result

With all of this work, you now get:

$ apt satisfy 'dh-debputy (>= 0.1.43~), debhelper (>= 13.16~), python3-lsprotocol, python3-levenshtein'
# For demo purposes, pull two known repos (feel free to use your own packages here)
$ git clone https://salsa.debian.org/debian/debhelper.git -b debian/13.16
$ git clone https://salsa.debian.org/debian/debputy.git -b debian/0.1.43
$ cd debhelper
$ mv debian/debhelper.install debian/debhelper.intsall
$ debputy lint
warning: File: debian/debhelper.intsall:1:0:1:0: The file "debian/debhelper.intsall" is likely a typo of "debian/debhelper.install"
    File-level diagnostic
$ mv debian/debhelper.intsall debian/debhleper.install
$ debputy lint
warning: File: debian/debhleper.install:1:0:1:0: Possible typo in "debian/debhleper.install". Consider renaming the file to "debian/debhelper.debhleper.install" or "debian/debhelper.install" if it is intended for debhelper
    File-level diagnostic
$ cd ../debputy
$ touch debian/install
$ debputy lint --no-warn-about-check-manifest
warning: File: debian/install:1:0:1:0: The file debian/install is related to a command that is not active in the dh sequence with the current addons
    File-level diagnostic

As mentioned, you also get these diagnostics in the editor via the debputy lsp server feature. Here the diagnostics appear in debian/control over the package name for technical reasons.

The editor side still needs a bit more work. Notably, renaming a file does not automatically trigger a re-check; the change will first be caught on the next edit to debian/control. Likewise, changes to debian/rules that add --with to dh may also have some limitations depending on the editor. Saving both files and then triggering an edit of debian/control seems to work reliably, but ideally it should not be that involved.

The debhelper side could also do with some work: removing the unnecessary support for the name segment from the many file stems that do not need it, and announcing that to debputy.

Anyhow, it is still a vast improvement over the status quo that was "Why is my file silently ignored!?".

Categories: FLOSS Project Planets

Russ Allbery: Review: Welcome to Boy.Net

Planet Debian - Sat, 2024-07-06 22:10

Review: Welcome to Boy.Net, by Lyda Morehouse

Series: Earth's Shadow #1
Publisher: Wizard's Tower Press
Copyright: April 2024
ISBN: 1-913892-71-9
Format: Kindle
Pages: 355

Welcome to Boy.Net is a science fiction novel with cyberpunk vibes, the first of a possible series.

Earth is a largely abandoned wasteland. Humanity has survived in the rest of the solar system and spread from Earth's moon to the outer planets. Mars is the power in the inner system, obsessed with all things Earth and effectively run by the Earth Nations' Peacekeeping Force, the ENForcers. An ENForcer soldier is raised in a creche from an early age, implanted with cybernetic wetware and nanite enhancements, and extensively trained to be an elite fighting unit. As befits a proper military, every ENForcer is, of course, male.

The ENForcers thought Lucia Del Toro was a good, obedient soldier. They also thought she was a man. They were wrong about those and many other things. After her role in an atrocity that named her the Scourge of New Shanghai, she went AWOL and stole her command ship. Now she and her partner/girlfriend Hawk, a computer hacker from Luna, make a living with bounty hunting jobs in the outer system. The ENForcers rarely cross the asteroid belt; the United Miners see to that.

The appearance of an F-class ENForcer battle cruiser in Jupiter orbit is a very unpleasant surprise. Lucia and Hawk hope it has nothing to do with them. That hope is dashed when ENForcers turn up in the middle of their next job: a bounty to retrieve an AI eye.

I first found Lyda Morehouse via her AngeLINK cyberpunk series, the last of which was published in 2011. Since then, she's been writing paranormal romance and urban fantasy as Tate Hallaway. This return to science fiction is an adventure with trickster hackers, throwback anime-based cowboy bars, tense confrontations with fascist thugs, and unexpected mutual aid, but its core is a cyberpunk look at the people who are unwilling or unable to follow the rules of social conformity. Gender conformity, specifically.

Once you understand what this book is about, Welcome to Boy.Net is a great title, but I'm not sure it serves its purpose as a marketing tool. This is not the book that I would have expected from that title in isolation, and I'm a bit worried that people who would like it might pass it by.

Inside the story, Boy.Net is the slang term for the cybernetic network that links all ENForcers. If this were the derogatory term used by people outside the ENForcers, I could see it, but it's what the ENForcers themselves call it. That left me with a few suspension of disbelief problems, since the sort of macho assholes who are this obsessed with male gender conformance usually consider "boys" to be derogatory and wouldn't call their military cybernetic network something that sounds that belittling, even as a joke. It would be named after some sort of Orwellian reference to freedom, or something related to violence, dominance, brutality, or some other "traditional male" virtue.

But although this term didn't work for me as world-building, it's a beautiful touch thematically.

What Morehouse is doing here is the sort of concretized metaphor that science fiction is so good at: an element of world-building that is both an analogy for something the reader is familiar with and is also a concrete piece of world background that follows believable rules and can be manipulated by the characters. Boy.Net is trying to reconnect to Lucia against her will. If it succeeds, it will treat the body modifications she's made as damage and try to reverse all of them, attempting to convert her back to the model of an ENForcer. But it is also a sharp metaphor for how gender roles are enforced in our world: a child assigned male is connected to a pervasive network of gender expectations and is programmed, shaped, and monitored to match the social role of a boy. Even if they reject those expectations, the gender role keeps trying to reconnect and convert them back.

I really enjoyed Morehouse's handling of the gender dynamics. It's an important part of the plot, but it's not the only thing going on or the only thing the characters think about. Lucia is occasionally caught by surprise by well-described gender euphoria, but mostly gender is something other people keep trying to impose on her because they're obsessed with forcing social conformity.

The rest of the book is a fun romp with a few memorable characters and a couple of great moments with unexpected allies. Hawk and Lucia have an imperfect but low drama relationship that features a great combination of insight and the occasional misunderstanding. It's the kind of believable human relationship that I don't see very much in science fiction, written with the comfortable assurance of an author with over a dozen books under her belt. Some of the supporting characters are also excellent, including a non-binary deaf hacker that I wish had been a bit more central to the story.

This is not the greatest science fiction novel I've read, but it was entertaining throughout and kept me turning the pages. Recommended if you want some solar-system cyberpunk in your life.

Welcome to Boy.Net reaches a conclusion of sorts, but there's an obvious hook for a sequel and a lot of room left for more stories. I hope enough people buy this book so that I get to read that sequel.

Rating: 7 out of 10

Categories: FLOSS Project Planets
