Feeds

Drupal blog: Drupal 11 released

Planet Drupal - Thu, 2024-08-01 18:57

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Today is a big day for Drupal as we officially released Drupal 11!

In recent years, we've seen an uptick in innovation in Drupal. Drupal 11 continues this trend with many new and exciting features. You can see an overview of these improvements in the video below:

Drupal 11 has been out for less than six hours, and updating my personal site was my first order of business this morning. I couldn't wait! Dri.es is now running on Drupal 11.

I'm particularly excited about two key features in this release, which I believe are transformative and will likely reshape Drupal in the years ahead:

  1. Recipes (experimental): This feature allows you to add new features to your website by applying a set of predefined configurations.
  2. Single-Directory Components: SDCs simplify front-end development by providing a component-based workflow where all necessary code for each component lives in a single, self-contained directory.

These two new features represent a major shift in how developers and site builders will work with Drupal, setting the stage for even greater improvements in future releases. For example, we'll rely heavily on them in Drupal Starshot.

Drupal 11 is the result of contributions from 1,858 individuals across 590 organizations. These numbers show how strong and healthy Drupal is. Community involvement remains one of Drupal's greatest strengths. Thank you to everyone who contributed to Drupal 11!

Categories: FLOSS Project Planets

Promet Source: Choosing the Best Content Management System for Local Government

Planet Drupal - Thu, 2024-08-01 17:39
Takeaway: Get a practical, step-by-step approach for small to mid-sized cities and counties considering a transition from proprietary to open-source content management systems. This guide offers a straightforward decision-making framework, practical insights, and actionable advice to help you evaluate your current needs and make an informed choice about your website's future.
Categories: FLOSS Project Planets

Trey Hunner: Why does "python -m json" not work? Why is it "json.tool"?

Planet Python - Thu, 2024-08-01 17:00

Have you ever used Python’s json.tool command-line interface?

You can run python -m json.tool against a JSON file and Python will print a nicely formatted version of the file.

python -m json.tool example.json
[
    1,
    2,
    3,
    4
]

Why do we need to run json.tool instead of json?

The history of python -m

Running a module as a command-line script using the -m argument gained full support in Python 2.5 (see PEP 338). While that feature was being implemented, an additional feature/bug was accidentally added, allowing packages to be run with the -m flag as well. When a package was run with the -m flag, the package’s __init__.py file would be run, with the __name__ variable set to __main__.

This accidental feature/bug was removed in Python 2.6.

It wasn’t until Python 2.7 that the ability to run python -m package was re-added (see below).

The history of the json module

The json module was added in Python 2.6. It was based on the third-party simplejson library.

That library originally relied on the fact that Python packages could be run with the -m flag to run the package’s __init__.py file with __name__ set to __main__ (see this code from version 1.8.1).

When simplejson was added to Python as the json module in Python 2.6, this bug/feature could no longer be relied upon as it was fixed in Python 2.6. To continue allowing for a nice command-line interface, it was decided that running a tool submodule would be the way to add a command-line interface to the json package (discussion here).

Python 2.7 added the ability to run any package as a command-line tool. The package would simply need a __main__.py file within it, which Python would run as the entry point to the package.

At this point, json.tool had already been added in Python 2.6 and no attempt was made (as far as I can tell) to allow python -m json to work instead. Once you’ve added a feature, removing or changing it can be painful.
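The Python 2.7 mechanism is easy to demonstrate with a throwaway package (the package name `mypkg` and its output string are made up for this sketch):

```python
import pathlib
import subprocess
import sys
import tempfile

# Build a throwaway package containing a __main__.py, then run it with -m.
with tempfile.TemporaryDirectory() as tmp:
    pkg = pathlib.Path(tmp) / "mypkg"
    pkg.mkdir()
    (pkg / "__init__.py").write_text("")
    (pkg / "__main__.py").write_text('print("mypkg running as a script")\n')
    # With -m, Python puts the current directory on sys.path, so running
    # from tmp makes mypkg importable.
    result = subprocess.run(
        [sys.executable, "-m", "mypkg"],
        cwd=tmp,
        capture_output=True,
        text=True,
    )
    print(result.stdout.strip())  # mypkg running as a script
```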

Could we make python -m json work today?

We could. We would just need to rename tool.py to __main__.py. To allow json.tool to keep working as well, we could make a new tool.py module that simply imports json.__main__.
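Here is a hedged sketch of that idea, using a made-up package name (`mypkg`) rather than json itself, since json has no __main__.py today: the CLI lives in __main__.py, and tool.py is a thin shim that re-runs it, so both invocations behave the same.

```python
import pathlib
import subprocess
import sys
import tempfile

# Throwaway package: __main__.py holds the CLI, tool.py just delegates to it.
outputs = []
with tempfile.TemporaryDirectory() as tmp:
    pkg = pathlib.Path(tmp) / "mypkg"
    pkg.mkdir()
    (pkg / "__init__.py").write_text("")
    (pkg / "__main__.py").write_text('print("CLI entry point")\n')
    (pkg / "tool.py").write_text(
        "import runpy\n"
        'runpy.run_module("mypkg.__main__", run_name="__main__")\n'
    )
    # Both `python -m mypkg` and `python -m mypkg.tool` print the same thing.
    for module in ("mypkg", "mypkg.tool"):
        result = subprocess.run(
            [sys.executable, "-m", module],
            cwd=tmp,
            capture_output=True,
            text=True,
        )
        outputs.append(result.stdout.strip())
    print(outputs)  # ['CLI entry point', 'CLI entry point']
```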

We could even go so far as to note that json.tool is deprecated.

Should we do this though? 🤔

Should we make python -m json work?

I don’t know about you, but I would rather type python -m json than python -m json.tool. It’s more memorable and easier to guess, if you’re not someone who has memorized all the command-line tools hiding in Python.

But would this actually be used? I mean, don’t people just use the jq tool instead?

Well, a sqlite3 shell was added to Python 3.12 despite the fact that third-party interactive sqlite prompts are fairly common.

It is pretty handy to have access to a tool within an unfamiliar environment where installing yet-another-tool might pose a problem. Think of Docker containers, a locked-down Windows machine, or any computer without network access or with network restrictions.

Personally, I’d like to see python3 -m json work. I can’t think of any big downsides. Can you?

Too long; didn’t read

In short:

  • Python 2.5 added support for the -m argument for modules, but not packages
  • A third-party simplejson app existed with a nice CLI that relied on a -m implementation bug allowing packages to be run using -m
  • Python 2.6 fixed that implementation quirk and broke the previous ability to run python -m simplejson
  • Python 2.6 also added the json module, which was based on this third-party simplejson package
  • Since python -m json couldn’t work anymore, the ability to run python -m json.tool was added
  • Python 2.7 added official support for python -m some_package by running a __main__ submodule
  • The json.tool module already existed in Python 2.6 and the ability to run python -m json was (as far as I can tell) never seriously considered
All thanks to git history and issue trackers

I discovered this by noting the first commit that added the tool submodule to simplejson, which notes the fact that this was for consistency with the new json standard library module.

Thank you git history. And thank you to the folks who brought us the simplejson library, the json module, and the ability to use -m on both a module and a package!

Categories: FLOSS Project Planets

health @ Savannah: GNU Health Hospital Management patchset 4.4.1 released

GNU Planet! - Thu, 2024-08-01 16:15

Dear community

GNU Health Hospital Management 4.4.1 has been released!

Priority: High

Table of Contents


  • About GNU Health Patchsets
  • Updating your system with the GNU Health Control Center
  • Installation Notes
  • List of other issues and tasks related to this patchset


About GNU Health Patchsets


We provide "patchsets" to stable releases. Patchsets allow applying bug fixes and updates on production systems. Always try to keep your production system up-to-date with the latest patches.

Patches and Patchsets maximize uptime for production systems, and keep your system updated, without the need to do a whole installation.

NOTE: Patchsets are applied on previously installed systems only. For new, fresh installations, download and install the whole tarball (i.e., gnuhealth-4.4.1.tar.gz).

Updating your system with the GNU Health Control Center


You can do automatic updates on the GNU Health HMIS kernel and modules using the GNU Health control center program.

Please refer to the administration manual section ( https://docs.gnuhealth.org/his/techguide/administration/controlcenter.html )

The GNU Health control center works on standard installations (those done following the installation manual on wikibooks). Don't use it if you use an alternative method or if your distribution does not follow the GNU Health packaging guidelines.

Installation Notes


You must apply previous patchsets before installing this patchset. If your patchset level is 4.4.0, then just follow the general instructions. You can find the patchsets at the GNU Health main download site on GNU.org (https://ftp.gnu.org/gnu/health/).

In most cases, GNU Health Control center (gnuhealth-control) takes care of applying the patches for you. 

Pre-requisites for upgrade to 4.4.1: None

Now follow the general instructions at
 https://docs.gnuhealth.org/his/techguide/administration/controlcenter.html

 
After applying the patches, make a full update of your GNU Health database as explained in the documentation.

When running "gnuhealth-control" for the first time, you will see the following message: "Please restart now the update with the new control center" Please do so. Restart the process and the update will continue.
 

  • Restart the GNU Health server


List of other issues and tasks related to this patchset



For detailed information about each issue, you can visit:
 https://codeberg.org/gnuhealth/his/issues

For detailed information, you can read about Patches and Patchsets.

 
Happy hacking!

Categories: FLOSS Project Planets

Deshni Govender: Voices of the Open Source AI Definition

Open Source Initiative - Thu, 2024-08-01 14:14

The Open Source Initiative (OSI) is running a blog series to introduce some of the people who have been actively involved in the Open Source AI Definition (OSAID) co-design process. The co-design methodology allows for the integration of diverging perspectives into one just, cohesive and feasible standard. Support and contribution from a significant and broad group of stakeholders is imperative to the Open Source process and is proven to bring diverse issues to light, deliver swift outputs and garner community buy-in.

This series features the voices of the volunteers who have helped shape and are shaping the Definition.

Meet Deshni Govender

What’s your background related to Open Source and AI?

I am the South Africa country focal point for the German Development Cooperation initiative “FAIR Forward – Artificial Intelligence for All” and the project strives for a more open, inclusive and sustainable approach to AI on an international level. More significantly, we seek to democratize the field of AI, to enable more robust, inclusive and self-determined AI ecosystems. Having worked in private sector and then now being in international development, my attention has been drawn to the disparity between the power imbalances of proprietary vs open and how this results in economic barriers for global majority, but also creates further harms and challenges for vulnerable populations and marginalized sectors, especially women. This fuelled my journey of working towards bridging the digital divide and digital gender gap through democratizing technology.

Some projects I am working on in this space include developing data governance models for African NLP (with Masakhane Foundation) and piloting new community-centered, equitable license types for voice data collection for language communities (with Mozilla).

What motivated you to join this co-design process to define Open Source AI?

I have experienced first hand the power imbalances that exist in geo-politics, but also in the context of economics where global minority countries shape the ‘global trajectory’ of AI without global voices. The definition of open means different things to different people / ecosystems / communities, and all voices should be heard and considered. Defining open means the values and responsibilities attached to it should be considered in a diverse manner, else the context of ‘open’ is in and of itself a hypocrisy.

Why do you think AI should be Open Source?

An enabling ecosystem is one that benefits all the stakeholders and ecosystem components. Inclusive efforts must be outlaid to explore and find tangible actions or potential avenues on how to reconcile the tension between openness, democracy and representation in AI training data whilst preserving community agency, diverse values and stakeholder rights. However, the misuse, colonization and misinterpretation of data continues unabated. Much of African culture and knowledge is passed down generations by story telling, art, dance and poetry and is done so verbally or through different ways of documentation, and in local manners and nuances of language. It is rarely digitized and certainly not in English. Language is culture and culture is context, yet somehow we find LLMs being used as an agent for language and context. Solutions and information are provided about and for communities but not with those communities, and the lack of transparency and post-colonial manipulation of data and culture is both irresponsible and should be considered a human rights violation.

Additionally, Open Source and open systems enable nations to develop inclusive AI policy processes so that policymakers from Global South countries can draw from peer experience on tackling their AI policies and AI-related challenges to find their own approaches to AI policy. This will also challenge dependence from and domination by western centric / Global North countries on AI policies to push a narrative or agenda on ‘what’ and ‘how’; i.e. Africa / Asia / LATAM must learn from us how to do X (since we hold the power, we can determine the extent and cost – exploitative). We aim for government self-determination and to empower countries, so that they may collectively have a voice on the global stage.

Has your personal definition of Open Source AI changed along the way? What new perspectives or ideas did you encounter while participating in the co-design process?

My personal definition has not changed but it has been refreshing to witness the diverse views on how open is defined. The idea that behavior (e.g. of tech oligopolies) could reshape the way we define an idea or concept was thought-provoking. It means therefore that as emerging technology evolves, the idea of ‘open’ could change still in the future, depending on the trajectory of emerging technology and the values that society holds and attributes.

What do you think the primary benefit will be once there is a clear definition of Open Source AI?

A clear and more inclusive definition of Open Source AI would commence a wave towards making data injustice, data invisibility, data extractivism, and data colonialism more visible and for which there exist repercussions. It would spur open, inclusive and responsible repositories of data, data use, and more importantly accuracy of use and interpretation. I am hoping that this would also spur innovative ways to track and monitor / evaluate use of Open Source data, so that local and small businesses are encouraged to develop in an Open Source way while still being able to track and monitor players who extract and commercialize without giving back.

Ideally it would begin the process (albeit transitional) of bridging the digital divide between source and resource countries (i.e. global majority where data is collected from versus those who receive and process data for commercial benefit).

What do you think are the next steps for the community involved in Open Source AI?

If we make everything Open Source, it encourages sharing and use in developing and deploying, offers transparency and shared learning but enables freeriding. However the corollary is that closed models such as copyright prioritize proprietary information and commercialisation but can limit shared innovation, and does not uphold the concept of communal efforts, community agency and development. How do we quell this tension? I would like to see the Open Source community working to find practical and actionable ways in which we can make this work (open, responsible and innovative but enabling community benefit / remuneration).

How to get involved

The OSAID co-design process is open to everyone interested in collaborating. There are many ways to get involved:

  • Join the working groups: be part of a team to evaluate various models against the OSAID.
  • Join the forum: support and comment on the drafts, record your approval or concerns to new and existing threads.
  • Comment on the latest draft: provide feedback on the latest draft document directly.
  • Follow the weekly recaps: subscribe to our newsletter and blog to be kept up-to-date.
  • Join the town hall meetings: participate in the online public town hall meetings to learn more and ask questions.
  • Join the workshops and scheduled conferences: meet the OSI and other participants at in-person events around the world.
Categories: FLOSS Research

Mike Driscoll: Displaying Pandas DataFrames in the Terminal

Planet Python - Thu, 2024-08-01 11:40

Have you ever wanted to show a pandas DataFrame in your terminal? Of course, you have! All you need is the textual-pandas package!

Yes, you can also view a pandas DataFrame in a REPL, such as IPython or Jupyter Notebook, but you can write a TUI application with textual-pandas and use that to display pandas DataFrames!

There are other ways to load a pandas DataFrame in your terminal, but this handy package makes it very simple! Let’s find out how!

Installation

Your first step is to install the textual-pandas package. The most common method of installing a Python package is using pip.

Open up your terminal and run the following command:

python -m pip install textual-pandas

Pip will install the textual-pandas package and any dependencies that it needs. Once that has finished successfully, you can learn how to use this package!

Usage

Using the textual-pandas package takes only a small amount of code. Open up your favorite Python editor, create a new Python file, and add this code that is from the textual-pandas documentation:

from textual.app import App
from textual_pandas.widgets import DataFrameTable

import pandas as pd

# Pandas DataFrame
df = pd.DataFrame()
df["Name"] = ["Dan", "Ben", "Don", "John", "Jim", "Harry"]
df["Score"] = [77, 56, 90, 99, 83, 69]
df["Grade"] = ["C", "F", "A", "A", "B", "D"]


class PandasApp(App):
    def compose(self):
        yield DataFrameTable()

    def on_mount(self):
        table = self.query_one(DataFrameTable)
        table.add_df(df)


if __name__ == "__main__":
    app = PandasApp()
    app.run()

This code will create a simple pandas DataFrame. Then, you will make a small Textual application that loads a table widget. In this case, you can use the custom widget to load the DataFrame.

When you run this code, you will see the DataFrame rendered as a table in your terminal.

Doesn’t that look nice? Now try adding your own DataFrame and re-run the code with your data!

Wrapping Up

The textual-pandas package makes displaying pandas DataFrames in your terminal easy. If you know much about the Textual package, you could even create a DataFrame editor in your terminal.

Give it a try and let me know what you come up with!

The post Displaying Pandas DataFrames in the Terminal appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

Aten Design Group: New Configuration Storage Options with Backdrop CMS

Planet Drupal - Thu, 2024-08-01 11:08

If you haven’t heard of it already, Backdrop CMS has a direct upgrade path from Drupal 7 - making it a viable and affordable option for sites lingering on Drupal 7. Backdrop’s recent 1.28.0 release brought a number of new features, including the ability to choose where your configuration is stored: in the file system or in the database. This can be very helpful, depending on where you are hosting your site. If you are hosting on a server with a writable directory on a solid-state drive (SSD), the default of storing active configuration files on the filesystem can be a very fast and efficient option. On some more specialized hosting where writable directories are on a distributed filesystem in which repeated reading and writing is slower than a SSD (Pantheon, for example), it can now be more efficient to keep the active configuration in the database. This is more in line with the way that modern Drupal handles configuration (with some notable exceptions between the two, such as the format that configuration is imported and exported in – JSON in Backdrop, YAML in Drupal).

Active and Staging Configuration States

When discussing a Backdrop site’s configuration, there are generally two configuration states under discussion: active and staging. The active configuration is the collection of configuration settings that are currently in use for the site, and the staging configuration is composed of any new changes that you are preparing to synchronize with the active settings.

When you run the configuration synchronization process, the staging configuration overwrites the active configuration and any necessary database modifications (e.g. new database schemas for new fields) will be run so that the database state matches what the configuration is describing.

Backdrop will by default store both active and staging configuration files as JSON files directly in the filesystem, in directories that are defined in your settings.php file. Directories that are known to be writable are selected by default, but often these will be adjusted during the site creation process – to allow the staging directory to be kept in your version controlled repository for example. (Note that if you are going to keep your staging configuration in your repository, there is an additional setting in the settings.php file to tell Backdrop not to empty the staging folder after a synchronization action occurs.)

Configuration Storage on Pantheon

Previously, in order to run a Backdrop site on Pantheon, the active configuration would typically be in the writable files directory, which, as was mentioned, can run slowly when reading or writing multiple times in succession. One other strategy for Pantheon was to keep the active configuration in the repository itself, in a non-writable directory. This provided the benefit of extremely fast loading of the configuration files, but had a significant downside in that you could not do any actions that required saving configuration. In fact, if you tried, you would get an error and run the risk of having a database that was out of sync with the configuration settings.

With the new config storage option, you can define where in the repository your staging folder will be, and keep the active configuration in the database. Periodically when you’re making changes locally, you’ll want to export the configuration from the live site into your staging folder and commit those changes to make sure you have saved the most recent configuration. As you make changes locally and commit changes into the configuration staging folder, you’ll be able to synchronize the configuration with each deployment to get the latest configuration changes added to your current database content.

Database Configuration Storage for Developing Locally

When developing locally, it can also be beneficial to store active configuration in the database because it makes it very easy to save snapshots of the site’s content and configuration in one step. This means you can do a “save game” at various points, test out various things to see if they work, and simply reload the saved database snapshot if you want to go back to the earlier version, without needing to worry about multiple moving parts – database plus active configuration files. (If you use DDEV, look into the built-in “snapshot” functionality.)

Typically the configuration storage setting will not be changed regularly, but a new module is in development that can make it much easier to switch between the two. If you are using storage on the file system in one environment and in the database on another (e.g. hosted vs. local installations), the Config Mover module could be a useful development tool. 

Backdrop's Flexibility

This is another case where Backdrop shines with its flexibility: you pick which configuration storage option works best for your current hosting environment.

Read more about this change in the official change record and let us know what you think in the comments below.

Laryn Kragt Bakker
Categories: FLOSS Project Planets

mark.ie: My LocalGov Drupal contributions for week-ending August 2nd, 2024

Planet Drupal - Thu, 2024-08-01 10:54

Here's what I've been working on for my LocalGov Drupal contributions this week. Thanks to Big Blue Door for sponsoring the time to work on these.

Categories: FLOSS Project Planets

The Drop Times: DrupalCamp Ottawa 2024: A Convergence of Web Development Expertise and Community Spirit

Planet Drupal - Thu, 2024-08-01 10:29
DrupalCamp Ottawa 2024 brings together web developers, designers, and digital strategists for a day of learning and networking. Discover key sessions, including insights from Martin Anderson-Clutz and David Pascoe-Deslauriers, and learn how the event fosters a strong Drupal community in Canada. The camp also features discussions on emerging technologies and an after-party for informal networking.
Categories: FLOSS Project Planets

Guido Günther: Free Software Activities July 2024

Planet Debian - Thu, 2024-08-01 05:58

A short status update on what happened on my side last month. Looking at unified push support for Chatty prompted some libcmatrix fixes and Chatty improvements (benefiting other protocols like SMS/MMS as well).

The Bluetooth status page in Phosh was a slightly larger change code wise as we also enhanced our common widgets for building status pages, simplifying the Wi-Fi status page and making future status pages simpler. But as usual investigating bugs, reviewing patches (thanks!) and keeping up with the changing world around us is what ate most of the time.

Phosh

A Wayland Shell for mobile devices

  • Update to latest gvc (MR)
  • Mark more strings as translatable (MR)
  • Improve Bluetooth support by adding a StatusPage (MR)
  • Improve vertical space usage for status pages (MR)
  • Fix build with newer GObject introspection, we can now finally enable --fatal-warnings (MR)
  • Fix empty system modal dialog on keyring lookups: (MR)
  • Send logind locked hint: (MR)
  • Small cleanups (MR, MR)
Phoc

A Wayland compositor for mobile devices

  • Update to wlroots 0.17.4 (MR)
  • Fix inhibitors crash (can affect video playback) (MR)
libphosh-rs

Phosh Rust bindings

phosh-osk-stub

A on screen keyboard for Phosh

  • Allow for up to five key rows and add more keyboard layouts: (MR)
phosh-mobile-settings
  • Allow to set prefer flash (MR)
  • Allow to set quick-silent: (MR)
  • Make DBus activatable (MR)
phosh-wallpapers

Wallpapers, Sounds and other artwork

  • Add Phone hangup event: (MR)
git-buildpackage

Suite to help with Debian packages in Git repositories

  • Fix tests with Python 3.12 and upload 0.9.34
Whatmaps

Tool to find processes mapping shared objects

  • Fix build with python 3.12 and release 0.0.14
Debian

The universal operating system

  • Upload whatmaps 0.0.14
  • Package libssc (MR) for upcoming sensor support on some Qualcomm based phones
  • Fix iio-sensor-proxy RC bug and cleanup a bit: (MR)
  • Update wlroots to 0.17.4 (MR)
  • Update calls to 46.3 (MR)
  • Prepare 0.18.0 (MR)
  • meta-phosh: Switch default font and recommend iio-sensor-proxy: (MR)
Mobian

A Debian derivative for mobile devices

  • Fix non booting kernels when built on Trixie (MR)
  • Make cross building a bit nicer: (MR, MR)
Calls

PSTN and SIP calls for GNOME

  • Emit phone-hangup event when a call ended (MR). Together with the sound theme changes this gives an audible sound when the other side hangs up.
  • Debug and document Freeswitch sofia-sip failure (it's TLS validation).
Livi

Minimalistic video player targeting mobile devices

  • Export stream position and duration via MPRIS (MR)
  • Slightly improve duration display (MR)
  • Improve docs a bit: (MR)
libcall-ui

Common user interface parts for call handling in GNOME and Phosh.

  • Release 0.1.2 (Build system cleanups only)
  • Add consistency checks: (MR)
feedbackd

DBus service for haptic/visual/audio feedback

  • Fix test failures on recent Fedora due to more strict json-glib: (MR)
Chatty

Messaging application for mobile and desktop

  • Continue work on push notifications: (MR)
    • Allow to delete push server
    • Hook into DBus connector class
    • Parse push notifications
  • Avoid duplicate lib build and fix warnings (MR)
  • Let F10 enable the primary menu: (MR)
  • Focus search when activating it (MR)
  • Fix search keybinding: (MR)
  • Fix keybinding to open help overlay (MR)
  • Don't hit assertions in libsoup by iterating the wrong context: (MR)
  • Matrix: Fix unread count getting out of sync: (MR)
  • Allow to disable purple via build profile (MR)
  • Fix critical during key verification (MR)
  • ChatInfo: Use AdwDialog and show Matrix room topic (MR)
  • Fix crash on account creation: (MR)
libcmatrix

A Matrix client library

  • Fix gir annotations, make gir and doc warnings fatal: (MR)
  • Cleanup README: (MR)
  • Some more minor cleanups and docs: (MR, MR, MR)
  • Generate enum types to make them usable by library consumers (MR)
  • Don't blindly iterate the default context (MR)
  • Allow to fetch a single event (useful for handling push notifications) (MR)
  • Make CmCallback behave like other callbacks (MR)
  • Allow to add/remove/fetch pushers sync (MR)
  • Add sync variant for fetching past events (MR)
  • Make a self contained library, test that in CI and make all public classes show up in the docs (MR)
  • Track unread count (MR)
  • Release libcmatrix 0.0.1
  • Add support for querying room topics (MR)
  • Allow to disable running the tests so superprojects have some choice (MR)
  • Fix crashes, use after free, … (MR, MR, MR)
Eigenvalue

A libcmatrix test client

Help Development

If you want to support my work see donations. This includes list of hardware we want to improve support for.

Categories: FLOSS Project Planets

mark.ie: How to use the LocalGov Drupal KeyNav Module

Planet Drupal - Thu, 2024-08-01 04:21

Here's a short video outlining the features of the LocalGov Drupal KeyNav module.

Categories: FLOSS Project Planets

More Ways to Rust

Planet KDE - Thu, 2024-08-01 03:35

In our earlier blog, The Smarter Way to Rust, we discuss why a blend of C++ and Rust is sometimes the best solution to building robust applications. But when you’re merging these two languages, it’s critical to keep in mind that the transition from C++ to Rust isn’t about syntax, it’s about philosophy.

Adapting to Rust’s world view

If you’re an experienced C++ developer who is new to Rust, it’s only natural to use the same patterns and approaches that have served you well before. But problem-solving in Rust requires a solid understanding of strict ownership rules, a new concurrency model, and differences in the meaning of “undefined” code. To prevent errors arising from an overconfident application of instinctual C++ solutions that don’t align with Rust’s idioms, it’s a good idea to start by tackling non-critical areas of code. This can give you room to explore Rust’s features without the pressure of potentially bringing critical systems crashing down.

Maximizing performance in a hybrid application

Performance is often a concern when adopting a new language. Luckily, Rust holds its own against C++ in terms of runtime efficiency. Both languages compile to machine code, have comparable runtime checks, don’t use run-time garbage collection, and share similar back-ends like GCC and LLVM. That means that in most real-world applications, the performance differences are negligible.

However, when Rust is interfaced with C++ via a foreign function interface (FFI), there may be a noticeable overhead. The FFI disrupts the optimizer’s ability to streamline code execution on both sides of the interface. Keep this in mind when structuring your hybrid applications, particularly in performance-critical sections. You could use link-time optimization (LTO) in LLVM to help with this, but the additional complexity of maintaining this solution makes it a consideration only if profiling/benchmarking points to FFI being a main source of overhead.

Embracing ‘unsafe’ Rust

The normal approach in Rust is to minimize code marked as ‘unsafe’. While both C++ and ‘unsafe’ Rust allow for pointer dereferencing that can potentially crash, the advantage in Rust is that this keyword makes issues easier to spot. ‘Unsafe’ Rust pinpoints where safety compromises are made, highlighting manageable risk areas. This in turn streamlines code reviews and simplifies the hunt for elusive bugs.
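The payoff is easiest to see in a safe wrapper that confines the one unchecked operation to a single audited line (our sketch, not code from the guide):

```rust
/// Returns the first byte of a slice, if any. The single `unsafe`
/// operation is fenced off behind a bounds check, so a reviewer only
/// has to validate this one spot.
fn first_byte(bytes: &[u8]) -> Option<u8> {
    if bytes.is_empty() {
        return None;
    }
    // SAFETY: the slice is non-empty, so index 0 is in bounds.
    Some(unsafe { *bytes.get_unchecked(0) })
}

fn main() {
    assert_eq!(first_byte(b"abc"), Some(b'a'));
    assert_eq!(first_byte(b""), None);
    println!("unsafe stayed contained");
}
```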

Bridging Rust with C/C++

Connecting Rust and C/C++ is clearly required when building hybrid applications. Thankfully, there’s a rich ecosystem of tools to support this:

  • Rust’s built-in extern “C” for straightforward C FFI needs
  • bindgen and cbindgen for generating bindings between Rust and C/C++
  • CXX and AutoCXX for robust, safe interoperability between the two environments
  • CXX-Qt for mixing Rust and Qt C++

Each tool serves a distinct purpose and choosing the right one can make the difference between a seamless integration and a complicated ball of compromises. (We provide more detailed guidance on this topic in our Hybrid Rust and C++ best practice guide, and my colleague Loren has a three-part blog series that talks about this topic too: part 1, part 2, part 3.)
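As a baseline, the built-in `extern "C"` route needs no external tooling at all. The sketch below exports a Rust function with a C-compatible symbol and calling convention; a C or C++ caller would declare it as `int32_t add_i32(int32_t, int32_t);`.

```rust
// `#[no_mangle]` keeps the symbol name predictable for the linker,
// and `extern "C"` selects the C calling convention.
#[no_mangle]
pub extern "C" fn add_i32(a: i32, b: i32) -> i32 {
    a.wrapping_add(b)
}

fn main() {
    // Callable from Rust too; from C++ it would come in via a header.
    println!("{}", add_i32(40, 2));
}
```

The generator tools in the list above (bindgen, cbindgen, CXX) exist to produce or consume exactly this kind of boundary automatically once it grows beyond a handful of functions.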

Adopting the microservice model

In complex applications with interwoven parts, it’s probably best to keep the Rust and C++ worlds distinct. Taking a cue from the microservices design pattern, you can isolate functionalities into separate service loops on each side, passing data between them through well-defined FFI calls. This approach circumvents issues of thread blocking and data ownership, shifting focus from direct code calls to service requests.
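A thread with a channel can stand in for the C++ side to show the shape of this pattern (our sketch; a real hybrid would route the requests through an FFI call): each side runs its own loop and exchanges owned values, so no pointer’s ownership is ever in doubt.

```rust
use std::sync::mpsc;
use std::thread;

// Stand-in for the work the other language's service would do.
fn handle_request(n: u64) -> u64 {
    n * 2
}

fn main() {
    let (req_tx, req_rx) = mpsc::channel::<u64>();
    let (rep_tx, rep_rx) = mpsc::channel::<u64>();

    // The "remote" service loop: consumes requests until the request
    // channel closes, replying with owned values.
    let service = thread::spawn(move || {
        for n in req_rx {
            rep_tx.send(handle_request(n)).unwrap();
        }
    });

    req_tx.send(21).unwrap();
    drop(req_tx); // closing the channel lets the service loop exit
    println!("reply = {}", rep_rx.recv().unwrap());
    service.join().unwrap();
}
```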

Navigating Rust ABI dynamics

Rust does not guarantee a stable ABI between releases, which influences how you must design and compile your applications. To prevent breaking the build, create statically linked executables or use C FFIs for shared libraries and plugins, ensure that your entire project sticks to a consistent Rust version, and encapsulate all dependencies behind a C FFI.
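In practice that means any data crossing a shared-library boundary should use C-compatible layout and symbols, since only the C ABI is stable across Rust releases. A minimal sketch:

```rust
// `#[repr(C)]` pins the field layout so both sides of the boundary
// agree on it regardless of which Rust (or C++) compiler built them.
#[repr(C)]
pub struct Point {
    pub x: f64,
    pub y: f64,
}

#[no_mangle]
pub extern "C" fn point_norm(p: Point) -> f64 {
    (p.x * p.x + p.y * p.y).sqrt()
}

fn main() {
    let p = Point { x: 3.0, y: 4.0 };
    println!("norm = {}", point_norm(p));
}
```

Without `#[repr(C)]`, Rust is free to reorder or repack the fields, which is exactly the kind of silent ABI mismatch this section warns about.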

Choosing the right build system

Building hybrid applications requires choosing a build system that works well for the code you have and adapts easily as your program evolves.

  • Start with Cargo (provided by the Rust environment) for Rust-centric projects with minimal C++ code.
  • Switch to CMake if C++ takes on a more significant role.
  • Consider Bazel or Buck2 build systems for handling complex, language-agnostic build processes.

If you’re considering a custom build system, closely examine the available options first. With the breadth of today’s build tool landscape, it’s usually overkill to invent your own solution.

Summary

By understanding the challenges and employing the right strategies, C++ developers can smoothly transition to Rust, leveraging the strengths of both languages to build robust and efficient applications. For more information on this trending topic, you’ll want to consult our Hybrid Rust and C++ best practice guide, which was created in collaboration with Ferrous Systems co-founder Florian Gilcher.

About KDAB

If you like this article and want to read similar material, consider subscribing via our RSS feed.

Subscribe to KDAB TV for similar informative short video content.

KDAB provides market-leading software consulting and development services and training in Qt, C++ and 3D/OpenGL. Contact us.

The post More Ways to Rust appeared first on KDAB.

Categories: FLOSS Project Planets

Tryton News: Newsletter July 2024

Planet Python - Thu, 2024-08-01 02:00

During the last month we focused on fixing bugs, improving the behaviour of things, speeding-up performance issues - building on the changes from our last release. We also added some new features which we would like to introduce to you in this newsletter.

For an in depth overview of the Tryton issues please take a look at our issue tracker or see the issues and merge requests filtered by label.

Changes for the User

Sales, Purchases and Projects

We’ve now added optional reference and warehouse fields to the purchase_request_quotation list view.

Accounting, Invoicing and Payments

We now allow users in the account admin group to update the invoice line descriptions in a revised invoice.

Until now a direct debit was created for each unpaid line. However, this may not be wanted when there are many payable lines. In these cases the company may want to have a single direct debit for the total amount of the payable lines. So, we’ve now introduced an option to define whether the receipt of a direct debit should be based on the lines or on the balance.

Stock, Production and Shipments

We’ve now extended the lot trace information to include moves generated from an inventory. This way we are able to show the origin of a lot.

We removed the default planned date on supplier shipments.

User Interface

It is now possible for the users to resize the preview panel of attachments.

Now users are able to type search filters immediately after opening a screen with a search entry.

System Data and Configuration

We’ve added the following party identifiers:

  • Taiwanese Tax Number
  • Turkish tax identification number
  • El Salvador Tax Number
  • Singapore’s Unique Entity Number
  • Montenegro Tax Number
  • Kenya Tax Number
New Documentation

The documentation for the following modules has been improved:

We’ve added a warning that because the database is modified in place it is important to make a backup before running an update.

We’ve migrated what were previously links to the configuration section to config-anchors like :ref:`trytond:topics-configuration`.

New Releases

We released bug fixes for the currently maintained long-term support series 7.0 and 6.0, and for the current series 7.2.

Changes for the System Administrator

Now when running trytond-admin with the --indexes argument but without any modules to update, the indexes of all models are updated.

Changes for Implementers and Developers

We are now able to run tests with PYTHONWARNINGS="error" to catch future issues earlier. Warnings from third-party libraries can be ignored by the filters defined in the TEST_PYTHONWARNINGS environment variable.

Authors: @dave @pokoli @udono

1 post - 1 participant

Read full topic

Categories: FLOSS Project Planets

Python Insider: Python 3.13.0 release candidate 1 released

Planet Python - Thu, 2024-08-01 01:30

I'm pleased to announce the release of Python 3.13 release candidate 1.

https://www.python.org/downloads/release/python-3130rc1/

 

This is the first release candidate of Python 3.13.0

This release, 3.13.0rc1, is the penultimate release preview. Entering the release candidate phase, only reviewed code changes which are clear bug fixes are allowed between this release candidate and the final release. The second candidate (and the last planned release preview) is scheduled for Tuesday, 2024-09-03, while the official release of 3.13.0 is scheduled for Tuesday, 2024-10-01.

There will be no ABI changes from this point forward in the 3.13 series, and the goal is that there will be as few code changes as possible.

Call to action

We strongly encourage maintainers of third-party Python projects to prepare their projects for 3.13 compatibility during this phase, and where necessary publish Python 3.13 wheels on PyPI to be ready for the final release of 3.13.0. Any binary wheels built against Python 3.13.0rc1 will work with future versions of Python 3.13. As always, report any issues to the Python bug tracker.

Please keep in mind that this is a preview release and while it’s as close to the final release as we can get it, its use is not recommended for production environments.

Core developers: time to work on documentation now
  • Are all your changes properly documented?
  • Are they mentioned in What’s New?
  • Did you notice other changes you know of to have insufficient documentation?
Major new features of the 3.13 series, compared to 3.12

Some of the new major new features and changes in Python 3.13 are:

New features

Typing

Removals and new deprecations
  • PEP 594 (Removing dead batteries from the standard library) scheduled removals of many deprecated modules: aifc, audioop, chunk, cgi, cgitb, crypt, imghdr, mailcap, msilib, nis, nntplib, ossaudiodev, pipes, sndhdr, spwd, sunau, telnetlib, uu, xdrlib, lib2to3.
  • Many other removals of deprecated classes, functions and methods in various standard library modules.
  • C API removals and deprecations. (Some removals present in alpha 1 were reverted in alpha 2, as the removals were deemed too disruptive at this time.)
  • New deprecations, most of which are scheduled for removal from Python 3.15 or 3.16.

(Hey, fellow core developer, if a feature you find important is missing from this list, let Thomas know.)

For more details on the changes to Python 3.13, see What’s new in Python 3.13. The next pre-release of Python 3.13 will be 3.13.0rc2, the final release candidate, currently scheduled for 2024-09-03.

More resources

Enjoy the new releases

Thanks to all of the many volunteers who help make Python Development and these releases possible! Please consider supporting our efforts by volunteering yourself or through organization contributions to the Python Software Foundation.

Whatevs,

Your release team,
Thomas Wouters
Łukasz Langa
Ned Deily
Steve Dower

Categories: FLOSS Project Planets

Sitback Solutions: Good Design for Housing – NSW Gov digital map case study

Planet Drupal - Wed, 2024-07-31 22:01
Enhancing urban understanding through interactive mapping for NSW Government

Project background

In NSW, where the architectural landscape boasts a vibrant mix of low-density single detached dwellings and high-rise apartment buildings, low- and mid-rise housing has become underrepresented in new development planning. Recognising this gap, Government Architect NSW (GANSW), a part of NSW Department of Planning, Housing ... Good Design for Housing – NSW Gov digital map case study
Categories: FLOSS Project Planets

Dirk Eddelbuettel: RQuantLib 0.4.24 on CRAN: Robustification

Planet Debian - Wed, 2024-07-31 21:04

A new minor release 0.4.24 of RQuantLib arrived on CRAN this afternoon (just before the CRAN summer break starting tomorrow), and has been uploaded to Debian too.

QuantLib is a rather comprehensive free/open-source library for quantitative finance. RQuantLib connects (some parts of) it to the R environment and language, and has been part of CRAN for more than twenty-one years (!!) as it was one of the first packages I uploaded.

This release of RQuantLib follows the recent release from last week which updated to QuantLib version 1.35 released that week, and solidifies conditional code for older QuantLib versions in one source file. We also updated and extended the configure source file, and increased the minimum version of QuantLib to 1.25.

Changes in RQuantLib version 0.4.24 (2024-07-31)
  • Updated detection of QuantLib libraries in configure

  • The minimum version has been increased to QuantLib 1.25, and DESCRIPTION has been updated to state it too

  • The dividend case for vanilla options still accommodates deprecated older QuantLib versions if needed (up to QuantLib 1.25)

  • The configure script now uses PKG_CXXFLAGS and PKG_LIBS internally, and shows the values it sets

Courtesy of my CRANberries, there is also a diffstat report for this release. As always, more detailed information is on the RQuantLib page. Questions, comments etc should go to the rquantlib-devel mailing list. Issue tickets can be filed at the GitHub repo.

If you like this or other open-source work I do, you can now sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

Junichi Uekawa: I've tried Android Element app for matrix first time during Debconf.

Planet Debian - Wed, 2024-07-31 18:58
I've tried the Android Element app for Matrix for the first time during DebConf. It feels good. For me it's a better IRC. I've been using it on my Chromebook, and one annoyance is that I haven't found a keyboard shortcut for sending messages. I would have expected Shift or Ctrl with Enter to send the current message, but so far I have been touching the display to send messages. Can I fix this? Where is the code?

Categories: FLOSS Project Planets

Junichi Uekawa: Joining Debconf, it's been 16 years.

Planet Debian - Wed, 2024-07-31 18:46
Joining Debconf, it's been 16 years. I feel very different. Back then I didn't understand the need for people who were not directly doing Debian work; now I think I appreciate things more. I don't remember what motivated me to do everything back then. Now I am doing what is necessary for me. Maybe it was back then too.

Categories: FLOSS Project Planets
