Feeds

Dries Buytaert: Sydney Opera House using Drupal

Planet Drupal - Fri, 2024-08-02 11:19

Across its 50-year history, the Sydney Opera House has welcomed musicians, dancers, actors, playwrights, filmmakers, contemporary artists, and thinkers who have both challenged and defined the cultural scene. As a result, the Sydney Opera House draws millions of visitors from around the world each year.

Not only is the Sydney Opera House of incredible cultural importance, it's also an architectural masterpiece. Its unique design makes it one of the most iconic buildings in the world, and has earned it a place as a UNESCO World Heritage Site.

Last year, the Sydney Opera House chose to migrate its website to Drupal. Today, it is running Drupal 10. The decision by such a prestigious institution to relaunch their website on Drupal highlights Drupal's flexibility, security, and ability to manage complex websites.

A couple of weeks ago, during my visit to Australia, I met with the Drupal team at the Sydney Opera House. I was particularly impressed by the team's dedication to using Open Source to expand cultural access and their enthusiasm for collaborating with other arts and cultural organizations. Their focus on innovation, inclusivity, and collaboration perfectly aligns with the core values of Open Source and the Open Web. Drupal is such a great solution for them!

Categories: FLOSS Project Planets

Colin Watson: Free software activity in July 2024

Planet Debian - Fri, 2024-08-02 08:27

My Debian contributions this month were all sponsored by Freexian.

You can also support my work directly via Liberapay.

OpenSSH

At the start of the month, I uploaded a quick fix (via Salvatore Bonaccorso) for a regression from CVE-2006-5051, found by Qualys; this was because I expected it to take me a bit longer to merge OpenSSH 9.8, which had the full fix.

This turned out to be a good guess: it took me until the last day of the month to get the merge done. OpenSSH 9.8 included some substantial changes to split the server into a listener binary and a per-session binary, which required some corresponding changes in the GSS-API key exchange patch. At this point I was very grateful for the GSS-API integration test contributed by Andreas Hasenack a little while ago, because otherwise I might very easily not have noticed my mistake: this patch adds some entries to the key exchange algorithm proposal, and on the server side I’d accidentally moved that to after the point where the proposal is sent to the client, which of course meant it didn’t work at all. Even with a failing test, it took me quite a while to spot the problem, involving a lot of staring at strace output and comparing debug logs between versions.

There are still some regressions to sort out, including a problem with socket activation, and problems in libssh2 and Twisted due to DSA now being disabled at compile-time.

Speaking of DSA, I wrote a release note for this change, which is now merged.

GCC 14 regressions

I fixed a number of build failures with GCC 14, mostly in my older packages: grub (legacy), imaptool, kali, knews, and vigor.

autopkgtest

I contributed a change to allow maintaining Incus container and VM images in parallel. I use both of these regularly (containers are faster, but some tests need full machine isolation), and the build tools previously didn’t handle that very well.

I now have a script that just does this regularly to keep my images up to date (although for now I’m running this with PATH pointing to autopkgtest from git, since my change hasn’t been released yet):

RELEASE=sid autopkgtest-build-incus images:debian/trixie
RELEASE=sid autopkgtest-build-incus --vm images:debian/trixie

Python team

I fixed dnsdiag’s uninstallability in unstable, and contributed the fix upstream.

I reverted python-tenacity to an earlier version due to regressions in a number of OpenStack packages, including octavia and ironic. (This seems to be due to #486 upstream.)

I fixed a build failure in python3-simpletal due to Python 3.12 removing the old imp module.
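For context, the usual migration away from the removed imp module looks something like the sketch below (the actual python3-simpletal fix may differ; the function name here just mirrors the old API for clarity):

```python
import importlib.util

# Rough equivalent of the removed imp.load_source() (gone in Python 3.12),
# built on importlib: load a module object from an arbitrary file path.
def load_source(name, path):
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```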

I added non-superficial autopkgtests to a number of packages, including httmock, py-macaroon-bakery, python-libnacl, six, and storm.

I switched a number of packages to build using PEP 517 rather than calling setup.py directly, including alembic, constantly, hyperlink, isort, khard, python-cpuinfo, and python3-onelogin-saml2. (Much of this was by working through the missing-prerequisite-for-pyproject-backend Lintian tag, but there’s still lots to do.)
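For reference, the smallest change that enables a PEP 517 build with setuptools is a pyproject.toml declaring the build backend (the version bound shown is illustrative):

```toml
[build-system]
requires = ["setuptools>=61"]
build-backend = "setuptools.build_meta"
```

With this in place, build frontends invoke the declared backend rather than running setup.py directly.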

I upgraded frozenlist, ipykernel, isort, langtable, python-exceptiongroup, python-launchpadlib, python-typeguard, pyupgrade, sqlparse, storm, and uncertainties to new upstream versions. In the process, I added myself to Uploaders for isort, since the previous primary uploader has retired.

Other odds and ends

I applied a suggestion by Chris Hofstaedtler to create /etc/subuid and /etc/subgid in base-passwd, since the login package is no longer essential.

I fixed a wireless-tools regression due to iproute2 dropping its (/usr)/sbin/ip compatibility symlink.

I applied a suggestion by Petter Reinholdtsen to add AppStream metainfo to pcmciautils.

Categories: FLOSS Project Planets

Real Python: The Real Python Podcast – Episode #215: Fetching Graph Data in Django With Strawberry & Prototype Purgatory

Planet Python - Fri, 2024-08-02 08:00

How do you integrate GraphQL into your Python web development? How about quickly building graph-based APIs inside Django's batteries-included framework? Christopher Trudeau is back on the show this week, bringing another batch of PyCoder's Weekly articles and projects.

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Web Review, Week 2024-31

Planet KDE - Fri, 2024-08-02 07:20

Let’s go for my web review for the week 2024-31.

Third-party cookies have got to go | W3C

Tags: tech, browser, privacy, google

Apparently this needs to be spelled out for browser providers to understand this needs to go.

https://www.w3.org/blog/2024/third-party-cookies-have-got-to-go/


CWE Top 25 Most Dangerous Software Weaknesses

Tags: tech, security

Looks like we keep coming back to the same types of vulnerabilities… it’s only a couple dozen usual suspects.

https://cwe.mitre.org/top25/archive/2023/2023_stubborn_weaknesses.html


“A Story About Jessica” by SwiftOnSecurity

Tags: tech, security, empathy

A very good piece; it’s nice that it’s been resurrected. It’s a good reminder that blaming and shaming users for general computing security is plain wrong. It’s we the developers and the UX designers who should be kept on our toes.

https://harihareswara.net/posts/2024/a-story-about-jessica-by-swiftonsecurity/


Using the term ‘artificial intelligence’ in product descriptions reduces purchase intentions

Tags: tech, ai, trust, marketing

Interesting finding. Looks like the general public’s trust in products with AI is not very high.

https://news.wsu.edu/press-release/2024/07/30/using-the-term-artificial-intelligence-in-product-descriptions-reduces-purchase-intentions/


Known tags and settings suggested to opt out of having your content used for AI training.

Tags: tech, ai, machine-learning, gpt

This ought to be easier, but this list should help a bit.

https://github.com/healsdata/ai-training-opt-out


Zen 5’s 2-Ahead Branch Predictor Unit: How a 30 Year Old Idea Allows for New Tricks – Chips and Cheese

Tags: tech, cpu, amd, performance

Interesting article about what’s coming for the branch predictor in the Zen 5 architecture from AMD.

https://chipsandcheese.com/2024/07/26/zen-5s-2-ahead-branch-predictor-unit-how-30-year-old-idea-allows-for-new-tricks/


Pulling Linux up by its bootstraps

Tags: tech, linux, kernel, trust, sustainability

Having a bootstrappable build is definitely not an easy feat! It is necessary, though, for trust and longevity reasons.

https://lwn.net/SubscriberLink/983340/25f5b1f6b1247079/


The Linux Kernel Module Programming Guide

Tags: tech, linux, kernel

Looks like a good online reference resource if you need to make your own modules.

https://sysprog21.github.io/lkmpg/


Safer code in C++ with lifetime bounds – Daniel Lemire’s blog

Tags: tech, c++, safety

It’s nice to see progress coming to lifetime checks in C++ compilers.

https://lemire.me/blog/2024/07/26/safer-code-in-c-with-lifetime-bounds/


35% Faster Than The Filesystem

Tags: tech, databases, storage, filesystem, sqlite, performance

Interesting experiment showing that BLOBs in a database can be a good alternative to individual files on a filesystem in some contexts.

https://sqlite.org/fasterthanfs.html
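The core idea can be sketched in a few lines with Python's sqlite3 module (the table and file names here are made up for illustration):

```python
import sqlite3

# Store small files as BLOBs in one SQLite database instead of
# as many individual files on disk.
con = sqlite3.connect(":memory:")  # use a file path in practice
con.execute("CREATE TABLE files (name TEXT PRIMARY KEY, data BLOB)")
payload = b"\x89PNG\r\n...fake image bytes..."
con.execute("INSERT INTO files VALUES (?, ?)", ("thumb.png", payload))
(data,) = con.execute(
    "SELECT data FROM files WHERE name = ?", ("thumb.png",)
).fetchone()
assert data == payload  # the bytes round-trip intact
```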


10 Examples Why cURL is an Awesome CLI Tool

Tags: tech, tools, internet, networking

This is not said enough, but this is indeed a very useful tool. There are a few tricks in this list I didn’t know.

https://martinheinz.dev/blog/113


WAT

Tags: tech, python, debugging

Nice little utility for Python programming. Helps to introspect on the spot.

https://igrek51.github.io/wat/


Tracing the evolution of a Python function with git log

Tags: tech, version-control, git

Oh fancy! I didn’t know this git log parameter. Definitely useful.

https://nerderati.com/tracing-the-evolution-of-a-python-function-with-git-log/
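The parameter in question is -L, which can trace a single function's history (available since Git 1.8.4). A self-contained sketch using a throwaway repository; the file and function names are illustrative:

```shell
tmp=$(mktemp -d) && cd "$tmp"
git init -q
git config user.email demo@example.com && git config user.name Demo
printf 'def greet():\n    return "hi"\n' > app.py
git add app.py && git commit -qm "add greet"
printf 'def greet():\n    return "hello"\n' > app.py
git commit -qam "tweak greet"
# Show every commit that touched the body of greet(), with its diff:
git log -L :greet:app.py --oneline
```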


My Favorite Tools + Techniques for Procedural Gamedev

Tags: tech, 3d, shader, graphics

Good collection of techniques for procedurally generating scene assets.

https://cprimozic.net/blog/tools-and-techniques-for-procedural-gamedev/


Simulating worlds on the GPU: Four billion years in four minutes

Tags: tech, graphics, shader, 3d, simulation

Very neat shader for a procedural Earth simulation. The breakdown is interesting, covering the tricks used in this shader.

https://davidar.io/post/sim-glsl


Unfashionably secure: why we use isolated VMs – Thinkst Thoughts

Tags: tech, infrastructure, virtualization, security, safety, complexity, cloud

Definitely not as fashionable as the Kubernetes craze, but this gives very interesting properties that multi-tenant applications can’t really provide. The article is nice as it properly lays out the pros and cons, helping make the choice depending on the context.

https://blog.thinkst.com/2024/07/unfashionably-secure-why-we-use-isolated-vms.html


Scope? How about we talk about thoroughness instead? - The Engineering Manager

Tags: tech, engineering, management, product-management

This is an interesting framing of the question. We often talk about scope, but how thorough are we when handling it? Should we even be that thorough? This might make some of the trade-offs more explicit.

https://www.theengineeringmanager.com/growth/scope-how-about-thoroughness/


Wonder why some team members need more guidance than others? Spoiler: it’s not about skill.

Tags: tech, management

Interesting concept of task relevant maturity. I should probably take it more into account myself.

https://substack.com/@refactoring/note/c-63277431


Bye for now!

Categories: FLOSS Project Planets

Drupal Association blog: Drupal launches Drupal 11, the latest version of the Open Source CMS

Planet Drupal - Fri, 2024-08-02 05:49

PORTLAND, Ore., 1 August 2024 - Drupal, the most powerful open source content management system for everyone from Fortune 500 enterprise companies to mission-driven nonprofits and entrepreneurial small businesses, is launching the latest upgrade to its popular software.

Drupal 11 continues enhancing the strengths of the platform. It makes structured content, workflows, and content governance more flexible and easier for ambitious builders.

“Drupal 11 is designed to empower ambitious site builders to build exceptional websites and to accelerate Drupal's innovation,” says Dries Buytaert, Founder and Project Lead of Drupal. “With Drupal 11, we've made Drupal more intuitive, powerful, and flexible, ensuring it continues to lead in web development and digital experience creation.”

This latest version of Drupal brings together code and design for a refreshed CMS navigation experience, with an updated toolbar and a collapsible left-hand menu, all designed to ensure a seamless development experience.

With new Recipes functionality, you can add new features to your website instantly by applying a set of predefined configuration. A recipe can provide anything that can be configured in Drupal, from a simple content type to a full suite of features. You can create your own recipes to share or reuse, or apply recipes created by other Drupal users. Recipes are experimental in this release but already usable and expected to be stable in 11.1.
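As a rough illustration, a recipe is described by a recipe.yml file along the following lines (the names and exact keys here are illustrative; see the Drupal documentation for the current schema):

```yaml
name: 'Article tags'
description: 'Adds a Tags vocabulary and enables Taxonomy.'
type: 'Content type'
install:
  - taxonomy
config:
  import:
    taxonomy:
      - taxonomy.vocabulary.tags
```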

Single-Directory Components simplify front-end development by consolidating all necessary code into a single directory, making components self-contained and effortlessly reusable.

Drupal’s high performance means it runs fast by default with swift page loading, and Drupal 11 is even faster than previous versions, running up to 50% faster on PHP 8.3. 

Accessible for every user

Drupal’s core strengths include its accessibility, security, multilingual capabilities, and flexible features. 

Thousands of developers worldwide contribute their expertise to ensure that Drupal is continuously pioneering the industry in these strengths. Supported by a global community of domain experts, it offers multilingual support with over 100 languages. 

Drupal 11 improves upon these core strengths with various features suitable for developing simple websites or complex web applications. 

“Accessibility is a key strength of Drupal,” says Tim Doyle, CEO of the Drupal Association. “The Open Web is for everyone, and our continual focus on accessibility, multilingual capabilities, and flexibility is intended to ensure that Drupal is a beacon of inclusiveness in an online world where many try to build walls and barriers to entry.”

With more features coming soon

Automatic Updates and Project Browser are two key features slated for a future release of Drupal 11 and are currently under development as contributed modules. 

“Drupal’s open source innovation keeps pressing ahead with the release of Drupal 11, including milestone features like Automatic Updates and Project Browser. These features will be key to the success of the new Drupal Starshot project, which will see Drupal become even easier to use for anyone wanting to unleash the power of the world’s leading enterprise CMS.” - Owen Lansbury, Drupal Association Board Chair

Automatic Updates module will apply patch-level updates to Drupal core in a separate, sandboxed copy of your site to keep you up and running without any interruptions. It can detect and report problems at every stage of the update process, so you don't have to discover them after an update is live. It automatically detects database updates and helps you run them during the process.

Project Browser will make it easy for site builders to extend the functionality, look and feel of Drupal. It provides a search interface in the Extend section of the Admin UI to find contributed modules and themes and research their capabilities. Once an extension is selected, instructions are provided on installing it on your site, all without leaving your website. 

Next steps

To start using Drupal 11, visit the Drupal 11 landing page. If you have questions about upgrading to Drupal 11, check out the FAQ page. 

To get help onboarding to Drupal 11 or creating a digital experience from the ground up, connect with one of our Drupal Certified Partners located around the world.

About Drupal and the Drupal Association

Drupal is the open source content management software trusted by millions of people and organizations worldwide. It’s supported by a network of over 10k professionals and over 100 Drupal Certified Partners. The Drupal Association is a nonprofit organization dedicated to accelerating Drupal innovation and supporting the growth of the open source community. This work delivers value to businesses, the digital community, and users of Drupal, in alignment with Drupal’s commitment to the Open Web.

Categories: FLOSS Project Planets

Golems GABB: Security and Speed Optimization in Drupal 11

Planet Drupal - Fri, 2024-08-02 05:24

Improved scalability and functionality are two promises of the anticipated Drupal 11, which is expected to be a dynamic and innovative release. Anticipation began building around Drupal 11 even before Drupal 10 was released, making it one of the most hyped versions in the history of Drupal.
Discussions and plans for Drupal 11 were already ongoing as Drupal 10 reached its final stages, showing how forward-thinking this community is. This proactive stance gives Drupal 11 a head start.
While it is not yet known exactly when Drupal 10 will reach its end of life, note that PHP 8.1, one of its dependencies, becomes obsolete in 2024. Drupal 11 will therefore require PHP 8.2 or later.

Categories: FLOSS Project Planets

Aigars Mahinovs: Debconf 24 photos

Planet Debian - Fri, 2024-08-02 05:00

Debconf 24 is coming to a close in Busan, South Korea this year.

I thought that last year in India was hot. This year somehow managed to beat that. With 35°C and high humidity, the 55 km I managed to walk between the two conference buildings really put the pressure on. Thankfully the air conditioning in the talk rooms has been great and fresh water has been plentiful. And the Korean food has been excellent and very energizing.

Today I will share with you the main group photo:

You can also see it in:

The rest of my photos from the event will be published next week. That will give me a bit more time to process them correctly and also give all of you a chance to see these pictures with fresh eyes and stir up new memories from the event.

Categories: FLOSS Project Planets

Talk Python to Me: #473: Being a developer with ADHD

Planet Python - Fri, 2024-08-02 04:00
Do you feel like ADHD is holding you back? Maybe you don't personally have ADHD but you work with folks who do and you'd like to support them better. Either way, how ADHD interplays with programming and programmers is pretty fascinating. On this episode we have Chris Ferdinandi who himself has ADHD and has written a lot about it to share his journey and his advice for thriving with ADHD as a programmer or data scientist.<br/> <br/> <strong>Episode sponsors</strong><br/> <br/> <a href='https://talkpython.fm/posit'>Posit</a><br> <a href='https://talkpython.fm/training'>Talk Python Courses</a><br/> <br/> <strong>Links from the show</strong><br/> <br/> <div><b>Chris on Mastodon</b>: <a href="https://mastodon.social/@cferdinandi" target="_blank" >@cferdinandi</a><br/> <b>ADHD FTW Talk Python Page</b>: <a href="https://adhdftw.com/talk-python/" target="_blank" >adhdftw.com</a><br/> <b>Building a Second Brain</b>: <a href="https://www.buildingasecondbrain.com" target="_blank" >buildingasecondbrain.com</a><br/> <b>Building a Second Brain Book</b>: <a href="https://www.buildingasecondbrain.com/book" target="_blank" >buildingasecondbrain.com</a><br/> <b>White Collar Jobs are Just Meetings</b>: <a href="https://www.theatlantic.com/ideas/archive/2024/07/white-collar-meetings-more-frequent/678941/" target="_blank" >theatlantic.com</a><br/> <b>Article with Fighting Duck-Sized Horses Agile</b>: <a href="https://www.mensurdurakovic.com/hard-to-swallow-truths-they-wont-tell-you-about-software-engineer-job/" target="_blank" >mensurdurakovic.com</a><br/> <b>Nothing Phone</b>: <a href="https://us.nothing.tech/pages/phone-2" target="_blank" >nothing.tech</a><br/> <b>Apple Watch</b>: <a href="https://www.apple.com/watch/" target="_blank" >apple.com</a><br/> <b>Todoist</b>: <a href="https://todoist.com" target="_blank" >todoist.com</a><br/> <b>Anytype (open source Notion)</b>: <a href="https://anytype.io" target="_blank" >anytype.io</a><br/> <b>Obsidian</b>: <a href="https://obsidian.md" 
target="_blank" >obsidian.md</a><br/> <b>Watch this episode on YouTube</b>: <a href="https://www.youtube.com/watch?v=A__7VFjPd9M" target="_blank" >youtube.com</a><br/> <b>Episode transcripts</b>: <a href="https://talkpython.fm/episodes/transcript/473/being-a-developer-with-adhd" target="_blank" >talkpython.fm</a><br/> <br/> <b>--- Stay in touch with us ---</b><br/> <b>Subscribe to us on YouTube</b>: <a href="https://talkpython.fm/youtube" target="_blank" >youtube.com</a><br/> <b>Follow Talk Python on Mastodon</b>: <a href="https://fosstodon.org/web/@talkpython" target="_blank" ><i class="fa-brands fa-mastodon"></i>talkpython</a><br/> <b>Follow Michael on Mastodon</b>: <a href="https://fosstodon.org/web/@mkennedy" target="_blank" ><i class="fa-brands fa-mastodon"></i>mkennedy</a><br/></div>
Categories: FLOSS Project Planets

Drupal blog: Drupal 11 released

Planet Drupal - Thu, 2024-08-01 18:57

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Today is a big day for Drupal as we officially released Drupal 11!

In recent years, we've seen an uptick in innovation in Drupal. Drupal 11 continues this trend with many new and exciting features. You can see an overview of these improvements in the video below:

Drupal 11 has been out for less than six hours, and updating my personal site was my first order of business this morning. I couldn't wait! Dri.es is now running on Drupal 11.

I'm particularly excited about two key features in this release, which I believe are transformative and will likely reshape Drupal in the years ahead:

  1. Recipes (experimental): This feature allows you to add new features to your website by applying a set of predefined configurations.
  2. Single-Directory Components: SDCs simplify front-end development by providing a component-based workflow where all necessary code for each component lives in a single, self-contained directory.

These two new features represent a major shift in how developers and site builders will work with Drupal, setting the stage for even greater improvements in future releases. For example, we'll rely heavily on them in Drupal Starshot.

Drupal 11 is the result of contributions from 1,858 individuals across 590 organizations. These numbers show how strong and healthy Drupal is. Community involvement remains one of Drupal's greatest strengths. Thank you to everyone who contributed to Drupal 11!

Categories: FLOSS Project Planets

Promet Source: Choosing the Best Content Management System for Local Government

Planet Drupal - Thu, 2024-08-01 17:39
Takeaway: Get a practical, step-by-step approach for small to mid-sized cities and counties considering a transition from proprietary to open-source content management systems. This guide offers a straightforward decision-making framework, practical insights, and actionable advice to help you evaluate your current needs and make an informed choice about your website's future.
Categories: FLOSS Project Planets

Trey Hunner: Why does "python -m json" not work? Why is it "json.tool"?

Planet Python - Thu, 2024-08-01 17:00

Have you ever used Python’s json.tool command-line interface?

You can run python -m json.tool against a JSON file and Python will print a nicely formatted version of the file.

python -m json.tool example.json
[
    1,
    2,
    3,
    4
]

Why do we need to run json.tool instead of json?

The history of python -m

Python 2.5 added the ability to run a module as a command-line script using the -m argument (see PEP 338). While that feature was being implemented, an additional feature/bug was accidentally added, allowing packages to be run with the -m flag as well. When a package was run with the -m flag, the package’s __init__.py file would be run, with the __name__ variable set to __main__.

This accidental feature/bug was removed in Python 2.6.

It wasn’t until Python 2.7 that the ability to run python -m package was re-added (see below).

The history of the json module

The json module was added in Python 2.6. It was based on the third-party simplejson library.

That library originally relied on the fact that Python packages could be run with the -m flag to run the package’s __init__.py file with __name__ set to __main__ (see this code from version 1.8.1).

When simplejson was added to Python as the json module in Python 2.6, this bug/feature could no longer be relied upon as it was fixed in Python 2.6. To continue allowing for a nice command-line interface, it was decided that running a tool submodule would be the way to add a command-line interface to the json package (discussion here).

Python 2.7 added the ability to run any package as a command-line tool. The package would simply need a __main__.py file within it, which Python would run as the entry point to the package.
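The mechanism is easy to demonstrate: any package containing a __main__.py becomes runnable via -m (the package name demo_pkg below is made up for the example):

```python
import os
import subprocess
import sys
import tempfile

# Create a throwaway package containing a __main__.py, then run it
# with "python -m <package>" to show the entry-point behavior.
with tempfile.TemporaryDirectory() as tmp:
    pkg = os.path.join(tmp, "demo_pkg")
    os.makedirs(pkg)
    open(os.path.join(pkg, "__init__.py"), "w").close()
    with open(os.path.join(pkg, "__main__.py"), "w") as f:
        f.write('print("demo_pkg run as a script")\n')
    result = subprocess.run(
        [sys.executable, "-m", "demo_pkg"],
        cwd=tmp, capture_output=True, text=True,
    )
    output = result.stdout.strip()
print(output)
```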

At this point, json.tool had already been added in Python 2.6 and no attempt was made (as far as I can tell) to allow python -m json to work instead. Once you’ve added a feature, removing or changing it can be painful.

Could we make python -m json work today?

We could. We would just need to rename tool.py to __main__.py. To allow json.tool to keep working as well, we could make a new tool.py module that simply imports json.__main__.

We could even go so far as to note that json.tool is deprecated.

Should we do this though? 🤔

Should we make python -m json work?

I don’t know about you, but I would rather type python -m json than python -m json.tool. It’s more memorable and easier to guess, if you’re not someone who has memorized all the command-line tools hiding in Python.

But would this actually be used? I mean, don’t people just use the jq tool instead?

Well, a sqlite3 shell was added to Python 3.12 despite the fact that third-party interactive sqlite prompts are fairly common.

It is pretty handy to have access to a tool within an unfamiliar environment where installing yet another tool might pose a problem. Think Docker, a locked-down Windows machine, or any computer without network access or with network restrictions.

Personally, I’d like to see python3 -m json work. I can’t think of any big downsides. Can you?

Too long; didn’t read

The “too long; didn’t read” version:

  • Python 2.5 added support for the -m argument for modules, but not packages
  • A third-party simplejson app existed with a nice CLI that relied on a -m implementation bug allowing packages to be run using -m
  • Python 2.6 fixed that implementation quirk and broke the previous ability to run python -m simplejson
  • Python 2.6 also added the json module, which was based on this third-party simplejson package
  • Since python -m json couldn’t work anymore, the ability to run python -m json.tool was added
  • Python 2.7 added official support for python -m some_package by running a __main__ submodule
  • The json.tool module already existed in Python 2.6 and the ability to run python -m json was (as far as I can tell) never seriously considered
All thanks to git history and issue trackers

I discovered this by noting the first commit that added the tool submodule to simplejson, which notes the fact that this was for consistency with the new json standard library module.

Thank you git history. And thank you to the folks who brought us the simplejson library, the json module, and the ability to use -m on both a module and a package!

Categories: FLOSS Project Planets

health @ Savannah: GNU Health Hospital Management patchset 4.4.1 released

GNU Planet! - Thu, 2024-08-01 16:15

Dear community

GNU Health Hospital Management 4.4.1 has been released!

Priority: High

Table of Contents


  • About GNU Health Patchsets
  • Updating your system with the GNU Health control Center
  • Installation notes
  • List of other issues related to this patchset


About GNU Health Patchsets


We provide "patchsets" to stable releases. Patchsets allow applying bug fixes and updates on production systems. Always try to keep your production system up-to-date with the latest patches.

Patches and Patchsets maximize uptime for production systems, and keep your system updated, without the need to do a whole installation.

NOTE: Patchsets are applied on previously installed systems only. For new, fresh installations, download and install the whole tarball (ie, gnuhealth-4.4.1.tar.gz)

Updating your system with the GNU Health control Center


You can do automatic updates on the GNU Health HMIS kernel and modules using the GNU Health control center program.

Please refer to the administration manual section ( https://docs.gnuhealth.org/his/techguide/administration/controlcenter.html )

The GNU Health control center works on standard installations (those done following the installation manual on wikibooks). Don't use it if you use an alternative method or if your distribution does not follow the GNU Health packaging guidelines.

Installation Notes


You must apply previous patchsets before installing this patchset. If your patchset level is 4.4.0, then just follow the general instructions. You can find the patchsets at GNU Health main download site at GNU.org (https://ftp.gnu.org/gnu/health/)

In most cases, GNU Health Control center (gnuhealth-control) takes care of applying the patches for you. 

Pre-requisites for upgrade to 4.4.1: None

Now follow the general instructions at
 https://docs.gnuhealth.org/his/techguide/administration/controlcenter.html

 
After applying the patches, make a full update of your GNU Health database as explained in the documentation.

When running "gnuhealth-control" for the first time, you will see the following message: "Please restart now the update with the new control center" Please do so. Restart the process and the update will continue.
 

  • Restart the GNU Health server


List of other issues and tasks related to this patchset



For detailed information about each issue, you can visit :
 https://codeberg.org/gnuhealth/his/issues
 

For detailed information you can read about Patches and Patchsets

 
Happy hacking!

Categories: FLOSS Project Planets

Deshni Govender: Voices of the Open Source AI Definition

Open Source Initiative - Thu, 2024-08-01 14:14

The Open Source Initiative (OSI) is running a blog series to introduce some of the people who have been actively involved in the Open Source AI Definition (OSAID) co-design process. The co-design methodology allows for the integration of diverging perspectives into one just, cohesive and feasible standard. Support and contribution from a significant and broad group of stakeholders is imperative to the Open Source process and is proven to bring diverse issues to light, deliver swift outputs and garner community buy-in.

This series features the voices of the volunteers who have helped shape and are shaping the Definition.

Meet Deshni Govender

What’s your background related to Open Source and AI?

I am the South Africa country focal point for the German Development Cooperation initiative “FAIR Forward – Artificial Intelligence for All” and the project strives for a more open, inclusive and sustainable approach to AI on an international level. More significantly, we seek to democratize the field of AI, to enable more robust, inclusive and self-determined AI ecosystems. Having worked in private sector and then now being in international development, my attention has been drawn to the disparity between the power imbalances of proprietary vs open and how this results in economic barriers for global majority, but also creates further harms and challenges for vulnerable populations and marginalized sectors, especially women. This fuelled my journey of working towards bridging the digital divide and digital gender gap through democratizing technology.

Some projects I am working on in this space include developing data governance models for African NLP (with Masakhane Foundation) and piloting new community-centered, equitable license types for voice data collection for language communities (with Mozilla).

What motivated you to join this co-design process to define Open Source AI?

I have experienced first hand the power imbalances that exist in geo-politics, but also in the context of economics where global minority countries shape the ‘global trajectory’ of AI without global voices. The definition of open means different things to different people / ecosystems / communities, and all voices should be heard and considered. Defining open means the values and responsibilities attached to it should be considered in a diverse manner, else the context of ‘open’ is in and of itself a hypocrisy.

Why do you think AI should be Open Source?

An enabling ecosystem is one that benefits all the stakeholders and ecosystem components. Inclusive efforts must be made to explore and find tangible actions or potential avenues for reconciling the tension between openness, democracy and representation in AI training data whilst preserving community agency, diverse values and stakeholder rights. However, the misuse, colonization and misinterpretation of data continue unabated. Much of African culture and knowledge is passed down through generations by storytelling, art, dance and poetry, and is done so verbally or through different ways of documentation, in local manners and nuances of language. It is rarely digitized, and certainly not in English. Language is culture and culture is context, yet somehow we find LLMs being used as an agent for language and context. Solutions and information are provided about and for communities but not with those communities, and the lack of transparency and the post-colonial manipulation of data and culture are both irresponsible and should be considered a human rights violation.

Additionally, Open Source and open systems enable nations to develop inclusive AI policy processes so that policymakers from Global South countries can draw from peer experience on tackling their AI policies and AI-related challenges to find their own approaches to AI policy. This will also challenge dependence from and domination by western centric / Global North countries on AI policies to push a narrative or agenda on ‘what’ and ‘how’; i.e. Africa / Asia / LATAM must learn from us how to do X (since we hold the power, we can determine the extent and cost – exploitative). We aim for government self-determination and to empower countries, so that they may collectively have a voice on the global stage.

Has your personal definition of Open Source AI changed along the way? What new perspectives or ideas did you encounter while participating in the co-design process?

My personal definition has not changed but it has been refreshing to witness the diverse views on how open is defined. The idea that behavior (e.g. of tech oligopolies) could reshape the way we define an idea or concept was thought-provoking. It means therefore that as emerging technology evolves, the idea of ‘open’ could change still in the future, depending on the trajectory of emerging technology and the values that society holds and attributes.

What do you think the primary benefit will be once there is a clear definition of Open Source AI?

A clear and more inclusive definition of Open Source AI would commence a wave towards making data injustice, data invisibility, data extractivism, and data colonialism more visible, and towards ensuring that repercussions exist for them. It would spur open, inclusive and responsible repositories of data and data use and, more importantly, accuracy of use and interpretation. I am hoping that this would also spur innovative ways to track, monitor and evaluate the use of Open Source data, so that local and small businesses are encouraged to develop in an Open Source way while still being able to track and monitor players who extract and commercialize without giving back.

Ideally it would begin the process (albeit transitional) of bridging the digital divide between source and resource countries (i.e. global majority where data is collected from versus those who receive and process data for commercial benefit).

What do you think are the next steps for the community involved in Open Source AI?

If we make everything Open Source, it encourages sharing and use in developing and deploying, and offers transparency and shared learning, but it enables freeriding. However, the corollary is that closed models such as copyright prioritize proprietary information and commercialisation, but can limit shared innovation, and do not uphold the concept of communal efforts, community agency and development. How do we quell this tension? I would like to see the Open Source community working to find practical and actionable ways in which we can make this work (open, responsible and innovative, but enabling community benefit / remuneration).

How to get involved

The OSAID co-design process is open to everyone interested in collaborating. There are many ways to get involved:

  • Join the working groups: be part of a team to evaluate various models against the OSAID.
  • Join the forum: support and comment on the drafts, record your approval or concerns to new and existing threads.
  • Comment on the latest draft: provide feedback on the latest draft document directly.
  • Follow the weekly recaps: subscribe to our newsletter and blog to be kept up-to-date.
  • Join the town hall meetings: participate in the online public town hall meetings to learn more and ask questions.
  • Join the workshops and scheduled conferences: meet the OSI and other participants at in-person events around the world.
Categories: FLOSS Research

Mike Driscoll: Displaying Pandas DataFrames in the Terminal

Planet Python - Thu, 2024-08-01 11:40

Have you ever wanted to show a pandas DataFrame in your terminal? Of course, you have! All you need is the textual-pandas package!

Yes, you can also view a pandas DataFrame in a REPL, such as IPython or Jupyter Notebook, but you can write a TUI application with textual-pandas and use that to display pandas DataFrames!

There are other ways to load a pandas DataFrame in your terminal, but this handy package makes it very simple! Let’s find out how!

Installation

Your first step is to install the textual-pandas package. The most common method of installing a Python package is using pip.

Open up your terminal and run the following command:

python -m pip install textual-pandas

Pip will install the textual-pandas package and any dependencies that it needs. Once that has finished successfully, you can learn how to use this package!

Usage

Using the textual-pandas package takes only a small amount of code. Open up your favorite Python editor, create a new Python file, and add the following code, which comes from the textual-pandas documentation:

from textual.app import App
from textual_pandas.widgets import DataFrameTable

import pandas as pd

# Pandas DataFrame
df = pd.DataFrame()
df["Name"] = ["Dan", "Ben", "Don", "John", "Jim", "Harry"]
df["Score"] = [77, 56, 90, 99, 83, 69]
df["Grade"] = ["C", "F", "A", "A", "B", "D"]


class PandasApp(App):
    def compose(self):
        yield DataFrameTable()

    def on_mount(self):
        table = self.query_one(DataFrameTable)
        table.add_df(df)


if __name__ == "__main__":
    app = PandasApp()
    app.run()

This code will create a simple pandas DataFrame. Then, you will make a small Textual application that loads a table widget. In this case, you can use the custom widget to load the DataFrame.

When you run this code, you will see the DataFrame rendered as a table in your terminal.

Doesn’t that look nice? Now try adding your own DataFrame and re-run the code with your data!
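As one way to bring your own data in, you could read a CSV with pandas before handing the result to the table. The CSV content below is invented purely for illustration; in a real app you would point pd.read_csv at a file on disk:

```python
import io

import pandas as pd

# Hypothetical CSV data; in practice this would be a file,
# e.g. pd.read_csv("scores.csv").
csv_data = io.StringIO(
    "Name,Score,Grade\n"
    "Dan,77,C\n"
    "Ben,56,F\n"
    "Don,90,A\n"
)

df = pd.read_csv(csv_data)
print(df.shape)  # (3, 3)
```

The resulting df can then be passed to table.add_df(df) exactly as in the documentation example.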

Wrapping Up

The textual-pandas package makes displaying pandas DataFrames in your terminal easy. If you know much about the Textual package, you could even create a DataFrame editor in your terminal.

Give it a try and let me know what you come up with!

The post Displaying Pandas DataFrames in the Terminal appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

Aten Design Group: New Configuration Storage Options with Backdrop CMS

Planet Drupal - Thu, 2024-08-01 11:08
New Configuration Storage Options with Backdrop CMS
jenna | Thu, 08/01/2024 - 09:08 | Backdrop CMS, Drupal

If you haven’t heard of it already, Backdrop CMS has a direct upgrade path from Drupal 7 - making it a viable and affordable option for sites lingering on Drupal 7. Backdrop’s recent 1.28.0 release brought a number of new features, including the ability to choose where your configuration is stored: in the file system or in the database. This can be very helpful, depending on where you are hosting your site. If you are hosting on a server with a writable directory on a solid-state drive (SSD), the default of storing active configuration files on the filesystem can be a very fast and efficient option. On some more specialized hosting where writable directories are on a distributed filesystem in which repeated reading and writing is slower than a SSD (Pantheon, for example), it can now be more efficient to keep the active configuration in the database. This is more in line with the way that modern Drupal handles configuration (with some notable exceptions between the two, such as the format that configuration is imported and exported in – JSON in Backdrop, YAML in Drupal).

Active and Staging Configuration States

When discussing a Backdrop site’s configuration, there are generally two configuration states under discussion: active and staging. The active configuration is the collection of configuration settings that are currently in use for the site, and the staging configuration is composed of any new changes that you are preparing to synchronize with the active settings.

When you run the configuration synchronization process, the staging configuration overwrites the active configuration and any necessary database modifications (e.g. new database schemas for new fields) will be run so that the database state matches what the configuration is describing.
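Conceptually (this is not Backdrop's actual code), the overwrite step of a synchronization run can be sketched with plain Python dicts standing in for the JSON configuration stores; the config names and values here are made up:

```python
import json

# Illustrative in-memory stores; Backdrop actually keeps these as JSON
# files on disk or rows in the database.
active = {"site.core": {"site_name": "Old name", "theme": "basis"}}
staging = {
    "site.core": {"site_name": "New name", "theme": "basis"},
    "field.body": {"type": "text_long"},
}


def synchronize(active, staging):
    """Staging overwrites active; report which config objects changed."""
    changed = [
        name for name in staging
        if json.dumps(staging[name], sort_keys=True)
        != json.dumps(active.get(name), sort_keys=True)
    ]
    active.update(staging)
    return changed


changed = synchronize(active, staging)
print(changed)  # ['site.core', 'field.body']
```

The sketch leaves out the second half of a real synchronization run: the database updates (such as new schemas for new fields) that bring the database in line with the configuration.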

By default, Backdrop stores both active and staging configuration as JSON files directly in the filesystem, in directories defined in your settings.php file. Directories that are known to be writable are selected by default, but these are often adjusted during the site creation process, for example to allow the staging directory to be kept in your version-controlled repository. (Note that if you are going to keep your staging configuration in your repository, there is an additional setting in the settings.php file to tell Backdrop not to empty the staging folder after a synchronization action occurs.)

Configuration Storage on Pantheon

Previously, in order to run a Backdrop site on Pantheon, the active configuration would typically be in the writable files directory, which, as was mentioned, can run slowly when reading or writing multiple times in succession. One other strategy for Pantheon was to keep the active configuration in the repository itself, in a non-writable directory. This provided the benefit of extremely fast loading of the configuration files, but had a significant downside in that you could not do any actions that required saving configuration. In fact, if you tried, you would get an error and run the risk of having a database that was out of sync with the configuration settings.

With the new config storage option, you can define where in the repository your staging folder will be, and keep the active configuration in the database. Periodically when you’re making changes locally, you’ll want to export the configuration from the live site into your staging folder and commit those changes to make sure you have saved the most recent configuration. As you make changes locally and commit changes into the configuration staging folder, you’ll be able to synchronize the configuration with each deployment to get the latest configuration changes added to your current database content.

Database Configuration Storage for Developing Locally

When developing locally, it can also be beneficial to store active configuration in the database because it makes it very easy to save snapshots of the site’s content and configuration in one step. This means you can do a “save game” at various points, test out various things to see if they work, and simply reload the saved database snapshot if you want to go back to the earlier version, without needing to worry about multiple moving parts – database plus active configuration files. (If you use DDEV, look into the built-in “snapshot” functionality.)

Typically the configuration storage setting will not be changed regularly, but a new module is in development that can make it much easier to switch between the two. If you are using storage on the file system in one environment and in the database on another (e.g. hosted vs. local installations), the Config Mover module could be a useful development tool. 

Backdrop's Flexibility

This is another case where Backdrop shines with its flexibility: you pick which configuration storage option works best for your current hosting environment.

Read more about this change in the official change record and let us know what you think in the comments below.

Laryn Kragt Bakker
Categories: FLOSS Project Planets

mark.ie: My LocalGov Drupal contributions for week-ending August 2nd, 2024

Planet Drupal - Thu, 2024-08-01 10:54

Here's what I've been working on for my LocalGov Drupal contributions this week. Thanks to Big Blue Door for sponsoring the time to work on these.

Categories: FLOSS Project Planets

The Drop Times: DrupalCamp Ottawa 2024: A Convergence of Web Development Expertise and Community Spirit

Planet Drupal - Thu, 2024-08-01 10:29
DrupalCamp Ottawa 2024 brings together web developers, designers, and digital strategists for a day of learning and networking. Discover key sessions, including insights from Martin Anderson-Clutz and David Pascoe-Deslauriers, and learn how the event fosters a strong Drupal community in Canada. The camp also features discussions on emerging technologies and an after-party for informal networking.
Categories: FLOSS Project Planets
