Feeds

Open Source AI Definition – Weekly update July 15

Open Source Initiative - Mon, 2024-07-15 15:26

It has been quiet on the forums over the 4th of July weekend, and OSI has been speaking at different events:

Why and how to certify Open Source AI
  • @jberkus expresses concern about the extensive resources required to certify AI systems, estimating that it would take weeks of work per system. This scale makes it impractical for a volunteer committee like License Review.
  • @shujisado reflects on past controversies over license conformity, noting that Open Source AI has the potential for a greater economic impact than early Open Source had. He acknowledges the need for a more robust certification process given this increased significance. He suggests that cooperation from the machine learning community or consortia might be necessary to address technical issues and monitor the certification process neutrally. He offers to help spread the word about OSAID within the Japanese ML/LLM development community.

@jberkus clarifies that the OSI would need full-time paid staff to handle the certifications, as the work cannot be managed by volunteers alone.

Categories: FLOSS Research

GNU Taler news: Videos from main talks of Privacy, Identity and Payment in the Next Generation Internet event at BFH

GNU Planet! - Mon, 2024-07-15 14:53
On the occasion of the Point Zero Forum's Innovation Tour, we showcased the privacy-preserving GNU Taler payment system along with its various applications and extensions – as well as other payment- and digital-identity-related projects – that are currently being developed at the Bern University of Applied Sciences and its international partners as part of the NGI TALER EU project. This page includes recordings of the main talks. In the near future, we will also post interviews with some of the poster presenters (sadly, only about half of the people could be interviewed due to time constraints).
Categories: FLOSS Project Planets

Talking Drupal: Talking Drupal #459 - Off The Cuff 8

Planet Drupal - Mon, 2024-07-15 14:00

Today we are talking about Config Actions, the Panel’s Favorite Drupal Modules, and Drupal Contribution. We’ll also cover Transform API as our module of the week.

For show notes visit: www.talkingDrupal.com/459

Topics
  • New Config Action: Place Block
  • Favorite Contrib modules
  • Slack channels
  • Preparing for Drupal 11
  • Drupal events
Resources

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Martin Anderson-Clutz - mandclu.com mandclu
Baddý Sonja Breidert - 1xINTERNET baddysonja

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to expose your Drupal site’s data as JSON using view modes, formatters, blocks, and more? There’s a module for that.
  • Module name/project name: Transform API
  • Brief history
    • How old: created in Sep 2023 by LupusGr3y, aka Martin Giessing of Denmark
    • Versions available: 1.1.0-beta4 and 1.0.2, both of which work with Drupal 9 and 10
  • Maintainership
    • Actively maintained, in fact the latest commit was earlier today
    • Security coverage
    • Documentation: in-depth README and a full user guide
    • Number of open issues: 14 open issues, 3 of which are bugs, but none against the current branch
  • Usage stats:
    • 2 sites
  • Module features and usage
    • After installing Transform API, you should be able to get the JSON for any entity on your site by adding “format=json” as a parameter to the URL (see the request sketch after this list)
    • To get more fields exposed as JSON, you can configure a Transform mode, using a Field UI configuration very similar to view modes
    • You can also add transform blocks to globally include specific data in all transformed URLs, in the same way you would use normal blocks to show information on your entity pages. The output of transform blocks is segmented into regions.
    • Where Drupal’s standard rendering engine produces render arrays that ultimately become HTML, Transform API replaces it with an engine that produces transform arrays that ultimately become JSON
    • Where Drupal’s standard JSON:API more or less exposes all information as raw data for the front end to format, Transform API allows more of the formatting to be managed on the back end, where it can use Drupal’s standard caching mechanisms, permission-based access, and more
    • Transform API also supports lazy transformers, which are callbacks that will be called after caching but before the JSON response is sent
    • You can also use alter hooks to manipulate the transformed data
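
To make that first point concrete, here is a rough sketch of what fetching a transformed entity could look like from Python with the requests library. The site URL and node path are placeholders, and the exact query parameter should be confirmed against the module’s README:

import requests

# Hypothetical Drupal site with Transform API installed; URL and path are placeholders
response = requests.get("https://example.com/node/1", params={"format": "json"})
response.raise_for_status()

data = response.json()  # the transform array rendered as JSON
print(data)
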
Categories: FLOSS Project Planets

Mike Driscoll: Creating Images in Your Terminal with Python and Rich Pixels

Planet Python - Mon, 2024-07-15 13:11

A newer Python package called Rich Pixels allows you to create images in your terminal and display them. Darren Burns, one of the team members from the Textual project, created this package.

Anyway, let’s find out how to use Rich Pixels!

Installation

You can install Rich Pixels using Python’s pip utility. Here’s how:

python -m pip install rich-pixels

Once you have Rich Pixels installed, you can try it out!

Displaying Images in the Terminal

Rich Pixels lets you take a pre-existing image and show it in your terminal. The higher the image’s resolution, the better the output will be. However, if your image has too many pixels, it probably won’t fit in your terminal, and much of it will be drawn off-screen.

For this example, you will use the Python Show Podcast logo and attempt to draw it in your terminal.

Open up your favorite Python editor and add the following code to it:

from rich_pixels import Pixels
from rich.console import Console

console = Console()
pixels = Pixels.from_image_path("python_show200.jpg")
console.print(pixels)

For this example, you will use a square image that is 200×200 pixels. You can run the code like this in your terminal:

python pixels.py

When you execute the command above, you will see something like this in your terminal:

 

As you can see, the image is a little pixelated and gets cut off at the bottom. Of course, this all depends on your monitor’s resolution.

Here’s what happens when you use an image that is 80×80 pixels:

You can also use the Pillow package to create an image object and pass that to Rich Pixels too. Here’s how that might look:

from PIL import Image
from rich_pixels import Pixels

with Image.open("path/to/image.png") as image:
    pixels = Pixels.from_image(image)

You can also create or draw your own images using Pillow and then pass them to Rich Pixels to display. There is some coverage of this topic in my article, Drawing Shapes on Images with Python and Pillow.
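
As a rough illustration of that workflow (the canvas size, colors, and shape below are arbitrary choices, not taken from the article), you could draw an image with Pillow and hand it straight to Rich Pixels:

from PIL import Image, ImageDraw
from rich_pixels import Pixels
from rich.console import Console

# Draw a small red square on a black canvas with Pillow
image = Image.new("RGB", (40, 40), "black")
draw = ImageDraw.Draw(image)
draw.rectangle((10, 10, 30, 30), fill="red")

# Hand the in-memory image to Rich Pixels and print it
console = Console()
console.print(Pixels.from_image(image))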

Wrapping Up

Rich Pixels is a fun way to add extra pizzazz to your terminal applications. Rich Pixels can also be used in a Textual application. While there probably aren’t a lot of use cases for this package, it’s a lot of fun to play around with.

Give it a try, and let me know what you create!

The post Creating Images in Your Terminal with Python and Rich Pixels appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

The Drop Times: Open is the New Default

Planet Drupal - Mon, 2024-07-15 12:57

Dear Readers,

Governments worldwide are increasingly embracing open-source software (OSS) to enhance their operations' transparency, security, and efficiency. Open source allows anyone to inspect, modify, and improve software, fostering a culture of collaboration and innovation. This shift is transforming the way public sector bodies function, ensuring digital solutions are cost-effective and tailored to the unique needs of each community.

Switzerland's recent mandate for OSS in its public sector is a prime example of this trend. The "Federal Law on the Use of Electronic Means for the Fulfilment of Governmental Tasks" (EMBAG) requires government-developed software to be open source, marking a significant step towards greater transparency and efficiency. However, Switzerland is not alone in this initiative; many countries are adopting similar policies to reduce dependency on costly proprietary software and foster innovation.

One of the notable benefits of open source in government is its potential for increased security and reliability. Continuous scrutiny and improvement by a global community of developers result in robust and adaptable solutions that can quickly respond to changing needs and security threats. Additionally, OSS helps prevent vendor lock-in, allowing governments to choose the best solutions without being tied to specific providers.

A recent study by Veniz Guzman on The DropTimes highlights the growing adoption of the open-source CMS Drupal among government entities of all sizes. Larger governments with complex digital needs are increasingly turning to Drupal for its flexibility and robust features. Interestingly, smaller to mid-sized entities are also recognizing Drupal's benefits, with rising adoption rates among smaller cities and counties. 

With that, let's move on to the important stories of last week.

Drupal Starshot has announced the formation of its Advisory Council, a significant step in its efforts to expand the reach and impact of Drupal. This council will provide strategic input and feedback to ensure that the initiative meets the needs of key stakeholders and end users. To learn more about the composition of the Starshot Advisory Council, click here.

In "Using Drupal Migrations to Modify Content Within a Site," Jaymie Strecker demonstrates how to use the Drupal Migrate API for restructuring content within a site. The article provides practical examples and assumes basic knowledge of the Migrate API.

In other news, the Drupal Association has announced HeroDevs as the inaugural partner for the new Drupal 7 Extended Security Support Provider Program. This initiative aims to assist organizations unable to migrate from Drupal 7 by the end-of-life date of January 5, 2025, ensuring their systems remain secure and compliant. 

Additionally, Drupal.org has announced that it will end support for Composer 1 on Packages.Drupal.org. This move is part of efforts to prepare its infrastructure for providing automatic updates for Drupal and upgrading Drupal.org itself. New packages and releases will no longer be available for Composer 1 starting August 12, 2024, and full support will be dropped on October 1, 2024.

Doubling the spirit of community, DrupalCon Europe has announced Tomasz Rogalski as the winner of the Mercè Drawing Contest. His drawing of Mercè, the celebrated figure of Barcelona's annual festival, will be featured on the official event stickers at DrupalCon Barcelona.

The Women in Drupal award will be presented shortly at DrupalCon Barcelona, an event dedicated to the Drupal community. This award aims to recognize and celebrate the significant contributions made by women within the Drupal ecosystem. To nominate a deserving Drupal artist, attendees can fill out the nomination form provided.

Drupal GovCon 2024 is only a month away, scheduled for August 13-15, 2024. It will feature a session titled "What the Heck is ARIA? A Beginner's Guide to ARIA for Accessibility," presented by Kat Shaw, a Lead Engineer and Accessibility Advocate (CPACC) at Lullabot. DrupalCamp Colorado has announced its second Keynote Speaker for this year's event, Matthew Saunders.

The DropTimes has compiled notable Drupal events happening throughout the week of July 15 to July 21. This curated list offers a glimpse into the varied activities taking place within the Drupal community, catering to enthusiasts of all skill levels. Read here.

amazee.io announces the launch of a new cloud region on AWS in Tokyo, Japan, enhancing the company's services in the Asia Pacific region. This expansion provides Japanese customers with reduced latency, improved fault tolerance, and compliance with local data privacy laws. 

Roderik Muit and Alexandru Ieremia of drunomics presented the new Custom Elements module at Drupal Developer Days in Burgas, in June 2024. This session, part of an informal "Birds of a Feather" discussion, introduced the latest version of the Custom Elements module, a vital part of drunomics' headless Drupal solutions. 

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. Also, join us on Drupal Slack at #thedroptimes.

Thank you,
Sincerely
Alka Elizabeth
Sub-editor, The DropTimes.

Categories: FLOSS Project Planets

The Drop Times: Using Drupal Migrations to Deploy New Content

Planet Drupal - Mon, 2024-07-15 11:02
Unlock the full potential of Drupal's Migrate API with Jaymie Strecker in part two of this insightful series. Learn how to seamlessly import new content by creating paragraphs, nodes, URL aliases, and redirects. Ideal for developers looking to streamline website updates and ensure smooth content transitions. Read on to elevate your Drupal migration skills!
Categories: FLOSS Project Planets

Python Bindings for KDE Frameworks: GSoC Midterm Review

Planet KDE - Mon, 2024-07-15 10:40

Week 7 of my 14-week GSoC project has finished, which means that it’s time for a midterm update! Many things have happened since the last update.

I added support for KI18n, KGuiAddons, KNotifications, KUnitConversion and KXMLGui. That was faster than expected. I also created a small unit conversion demo using KUnitConversion (it’s written in Python):

We have decided to move on to upstream the bindings to their corresponding repositories. For that, I set up a development environment and I’m now adding the code to each library.

Categories: FLOSS Project Planets

Real Python: Split Your Dataset With scikit-learn's train_test_split()

Planet Python - Mon, 2024-07-15 10:00

One of the key aspects of supervised machine learning is model evaluation and validation. When you evaluate the predictive performance of your model, it’s essential that the process be unbiased. Using train_test_split() from the data science library scikit-learn, you can split your dataset into subsets that minimize the potential for bias in your evaluation and validation process.

In this tutorial, you’ll learn:

  • Why you need to split your dataset in supervised machine learning
  • Which subsets of the dataset you need for an unbiased evaluation of your model
  • How to use train_test_split() to split your data
  • How to combine train_test_split() with prediction methods

In addition, you’ll get information on related tools from sklearn.model_selection.

Get Your Code: Click here to download the free sample code that you’ll use to learn about splitting your dataset with scikit-learn’s train_test_split().

Take the Quiz: Test your knowledge with our interactive “Split Your Dataset With scikit-learn's train_test_split()” quiz. You’ll receive a score upon completion to help you track your learning progress:


In this quiz, you'll test your understanding of how to use the train_test_split() function from the scikit-learn library to split your dataset into subsets for unbiased evaluation in machine learning.

The Importance of Data Splitting

Supervised machine learning is about creating models that precisely map the given inputs to the given outputs. Inputs are also called independent variables or predictors, while outputs may be referred to as dependent variables or responses.

How you measure the precision of your model depends on the type of problem you’re trying to solve. In regression analysis, you typically use the coefficient of determination, root mean square error, mean absolute error, or similar quantities. For classification problems, you often apply accuracy, precision, recall, F1 score, and related indicators.

The acceptable numeric values that measure precision vary from field to field. You can find detailed explanations from Statistics By Jim, Quora, and many other resources.

What’s most important to understand is that you usually need unbiased evaluation to properly use these measures, assess the predictive performance of your model, and validate the model.

This means that you can’t evaluate the predictive performance of a model with the same data you used for training. You need to evaluate the model with fresh data that hasn’t been seen by the model before. You can accomplish that by splitting your dataset before you use it.

Training, Validation, and Test Sets

Splitting your dataset is essential for an unbiased evaluation of prediction performance. In most cases, it’s enough to split your dataset randomly into three subsets:

  1. The training set is applied to train or fit your model. For example, you use the training set to find the optimal weights, or coefficients, for linear regression, logistic regression, or neural networks.

  2. The validation set is used for unbiased model evaluation during hyperparameter tuning. For example, when you want to find the optimal number of neurons in a neural network or the best kernel for a support vector machine, you experiment with different values. For each considered setting of hyperparameters, you fit the model with the training set and assess its performance with the validation set.

  3. The test set is needed for an unbiased evaluation of the final model. You shouldn’t use it for fitting or validation.

In less complex cases, when you don’t have to tune hyperparameters, it’s okay to work with only the training and test sets.
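
As a minimal sketch of that three-way split, you can call train_test_split() twice. The toy arrays and the 60/20/20 ratio below are arbitrary choices for illustration, not something the tutorial prescribes:

import numpy as np
from sklearn.model_selection import train_test_split

# Toy data: 100 samples with 2 features each, plus a matching target array
X = np.arange(200).reshape(100, 2)
y = np.arange(100)

# First hold back a test set (20% of the data here)
X_temp, X_test, y_temp, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Then split the remainder into training and validation sets (25% of 80% = 20% overall)
X_train, X_val, y_train, y_val = train_test_split(X_temp, y_temp, test_size=0.25, random_state=42)

print(X_train.shape, X_val.shape, X_test.shape)  # (60, 2) (20, 2) (20, 2)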

Underfitting and Overfitting

Splitting a dataset might also be important for detecting if your model suffers from one of two very common problems, called underfitting and overfitting:

  1. Underfitting is usually the consequence of a model being unable to encapsulate the relations among data. For example, this can happen when trying to represent nonlinear relations with a linear model. Underfitted models will likely have poor performance with both training and test sets.

  2. Overfitting usually takes place when a model has an excessively complex structure and learns both the existing relations among data and noise. Such models often have bad generalization capabilities. Although they work well with training data, they usually yield poor performance with unseen test data.

You can find a more detailed explanation of underfitting and overfitting in Linear Regression in Python.
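
Continuing the sketch above, one common way to spot these problems is to compare a model’s score on the training set with its score on the test set. The estimator here is only an example:

from sklearn.linear_model import LinearRegression

model = LinearRegression().fit(X_train, y_train)

print(model.score(X_train, y_train))  # R² on the data the model was fitted to
print(model.score(X_test, y_test))    # R² on unseen data; a much lower value here would suggest overfitting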

Prerequisites for Using train_test_split()

Now that you understand the need to split a dataset in order to perform unbiased model evaluation and identify underfitting or overfitting, you’re ready to learn how to split your own datasets.

You’ll use version 1.5.0 of scikit-learn, or sklearn. It has many packages for data science and machine learning, but for this tutorial, you’ll focus on the model_selection package, specifically on the function train_test_split().

Note: While this tutorial is tested with this specific version of scikit-learn, the features that you’ll use are core to the library and should work equivalently in other versions of scikit-learn as well.

You can install sklearn with pip:

Read the full article at https://realpython.com/train-test-split-python-data/ »

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Week 7 recap - vector approach

Planet KDE - Mon, 2024-07-15 08:30
Currently, I have three separate ideas. The first idea is that in kis_tool_freehand, where stroke initialization and end of stroke exist, we populate a vector, filter out extra points, and then take out corner pixels before passing it down. The problem I am ...
Categories: FLOSS Project Planets

Real Python: Quiz: How to Use Generators and yield in Python

Planet Python - Mon, 2024-07-15 08:00

In this quiz, you’ll test your understanding of Python generators.

Generators and the Python yield statement can help you when you’re working with large datasets that might overwhelm your machine’s memory. Another use case is when you have a complex function that needs to maintain an internal state every time it’s called.

When you understand Python generators, then you’ll be able to work with large datasets in a more Pythonic fashion, create generator functions and expressions, and apply your knowledge towards building efficient data pipelines.
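
As a tiny, invented example of that lazy behavior (not taken from the quiz itself), a generator function produces values one at a time instead of building a whole list in memory:

def squares(limit):
    # Yield square numbers one at a time; nothing is computed until a caller asks
    for n in range(limit):
        yield n * n

for value in squares(5):
    print(value)  # 0, 1, 4, 9, 16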

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Real Python: Quiz: How to Write Beautiful Python Code With PEP 8

Planet Python - Mon, 2024-07-15 08:00

In this quiz, you’ll test your understanding of how to write beautiful Python code with PEP 8.

By working through this quiz, you’ll revisit the key guidelines laid out in PEP 8 and how to set up your development environment to write PEP 8 compliant Python code.
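
As a small, generic illustration of the kind of guideline PEP 8 covers (this snippet is invented, not taken from the quiz), compare a non-compliant function with a cleaned-up equivalent:

# Not PEP 8 compliant: camelCase name, no spaces after the comma or around the operator
def addNumbers( x,y ):
    return x+y

# PEP 8 compliant: snake_case name and consistent spacing
def add_numbers(x, y):
    return x + y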

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Steinar H. Gunderson: Pull requests via git push

Planet Debian - Mon, 2024-07-15 07:15

This project inspired me to investigate whether git.sesse.net could start accepting patches in a format that was less friction than email, and didn't depend on custom SSH-facing code written by others. And it seems it really can! The thought was to simply allow git push from anyone, but that git push doesn't actually push anything; it just creates a pull request (by email). It was much simpler than I'd thought. First make an empty hooks directory with this pre-receive hook (make sure it is readable by your web server, and marked as executable):

#! /bin/bash
set -e

read oldsha newsha refname

git send-email --to=steinar+git@gunderson.no --suppress-cc=all --subject-prefix="git-anon-push PATCH" --quiet $oldsha..$newsha

echo ''
echo 'Thank you for your contribution! The patch has been sent by email and will be examined for inclusion.'
echo 'The push will now exit with an error. No commits have actually been pushed.'
exit 1

Now we can activate this hook and anonymous push in each project (I already run git-http-backend on the server for pulling, and it supports push just fine if you tell it to), and give www-data write permissions to store the pushed objects temporarily:

git config core.hooksPath /srv/git.sesse.net/hooks
git config http.receivepack true
sudo chgrp -R www-data .
chmod -R g+w .

And now any attempts to git push will send me patch emails that I can review and optionally include!

It's not perfect. For instance, it doesn't support multipush, and if you try to push to a branch that doesn't exist already, it will error out since $oldsha is all-zeros. And the From: header is always www-data (but I didn't want to expose myself to all sorts of weird injection attacks by trying to parse the committer email). And of course, there's no spam control, but if you want to spam me with email, then you could just like… send email?

(I have backups, in case someone discovers some sort of evil security hole.)

Categories: FLOSS Project Planets

Thomas Lange: FAIme adds Korean language support

Planet Debian - Mon, 2024-07-15 07:01

In two weeks, DebConf24, the Debian conference, starts in Busan, South Korea. Therefore I've added support for the Korean language to the web service of FAI:

https://fai-project.org/FAIme/

Another new feature of the FAIme service will be announced at DebConf24 in August.

Categories: FLOSS Project Planets

Kushal Das: Disable this Firefox preference to save privacy

Planet Python - Mon, 2024-07-15 04:55

If you are on the latest Firefox 128 (which is available on Fedora 40), you should uncheck the following preference to disable Privacy-Preserving Attribution. Firefox added this experimental feature and turned it on by default for everyone, which should not be the case.

You can find it in the preferences window.

Categories: FLOSS Project Planets

Zato Blog: Network packet brokers and automation in Python

Planet Python - Mon, 2024-07-15 00:43
Network packet brokers and automation in Python
2024-07-15, by Dariusz Suchojad

Packet brokers are crucial for network engineers, providing a clear, detailed view of network traffic, aiding in efficient issue identification and resolution.

But what is a network packet broker (NPB), really? Why are they needed? And how do you automate one in Python?

➤ Read this article about network packet brokers and their automation in Python to find out more.

More resources

➤ Click here to read more about using Python and Zato in telecommunications
➤ Python API integration tutorial
➤ What is an integration platform?

More blog posts
Categories: FLOSS Project Planets

Russ Allbery: podlators v6.0.2

Planet Debian - Sun, 2024-07-14 15:53

podlators contains the Perl modules and scripts used to convert Perl's documentation language, POD, to text and manual pages.

This is another small bug fix release that is part of iterating on getting the new podlators incorporated into Perl core. The bug fixed in this release was another build system bug I introduced in recent refactorings, this time breaking the realclean target so that some generated scripts were not removed. Thanks to James E Keenan for the report.

You can get the latest version from CPAN or from the podlators distribution page.

Categories: FLOSS Project Planets

Russ Allbery: DocKnot 8.0.1

Planet Debian - Sun, 2024-07-14 15:38

DocKnot is my static web site generator, with some additional features for managing software releases.

This release fixes some bugs in the newly-added conversion of text to HTML that were due to my still-incomplete refactoring of that code. It still uses some global variables, and they were leaking between different documents and breaking the formatting. It also fixes consistency problems with how the style parameter in *.spin files was interpreted, and fixes some incorrect docknot update-spin behavior.

You can get the latest version from CPAN or from the DocKnot distribution page.

Categories: FLOSS Project Planets

Real Python: Quiz: How to Flatten a List of Lists in Python

Planet Python - Sun, 2024-07-14 08:00

In this quiz, you’ll test your understanding of how to flatten a list in Python.

You’ll write code and answer questions to revisit the concept of converting a multidimensional list, such as a matrix, into a one-dimensional list.
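
For a quick refresher before taking the quiz, here is one invented example of that conversion; both approaches below are standard Python and are not specific to the quiz:

import itertools

matrix = [[1, 2], [3, 4], [5, 6]]

# Flatten with a nested comprehension...
flat = [item for row in matrix for item in row]

# ...or with itertools.chain.from_iterable
flat_again = list(itertools.chain.from_iterable(matrix))

print(flat)        # [1, 2, 3, 4, 5, 6]
print(flat_again)  # [1, 2, 3, 4, 5, 6]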

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Real Python: Quiz: Python Type Checking

Planet Python - Sun, 2024-07-14 08:00

In this quiz, you’ll test your understanding of Python Type Checking.

By working through this quiz, you’ll revisit type annotations and type hints, adding static types to code, running a static type checker, and enforcing types at runtime.
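
As a minimal, invented sketch of what such annotations look like (the function here is not from the quiz), a static checker such as mypy can verify calls against the hints:

def greet(name: str, excited: bool = False) -> str:
    # The annotations are hints; Python does not enforce them at runtime by itself
    greeting = f"Hello, {name}"
    return greeting + "!" if excited else greeting

print(greet("Ada", excited=True))
# A static checker would flag greet(42), because 42 is not a str.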

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets
