FLOSS Project Planets

Python Software Foundation: PSF Board update on improvements to the PSF Grants program

Planet Python - Thu, 2024-07-18 07:00

In December 2023 we received an open letter from a coalition of organizers from the pan-African Python community asking the PSF to address concerns and frustrations around our Grants Program. The letter writers agreed to meet with us in December and January to go into more detail and share more context from the pan-African community. Since then, we have been doing a lot of listening and discussing how to better serve the community with the input that was offered.

The PSF Board takes the open letter from the pan-African delegation seriously, and we began to draft a plan to address everything in the letter. We also set up improved two-way communications so that we can continue the conversation with the community. The writers of the open letter have now met several times with members of the PSF board. We are thankful for their insight and guidance on how we can work together and be thoroughly and consistently supportive of the pan-African Python community.

We care a lot about building consensus and ensuring that we are promising solutions that have support and a realistic workflow. Building an achievable plan that meets the needs of the community has involved work for the PSF’s small staff. It also included additional conversations with and input from the volunteers who serve on the Board and in our working groups, especially the Grants Working Group. We are grateful for the input as well as the opportunity to improve.

Plans and progress on the Grants Program

Here is what’s already been done:

  • Set up Grants Program Office Hours to open up a line of casual sustained communication between the community and our staff members who support the grants program. Several sessions have already taken place.
  • The PSF contracted Carol Willing to do a retrospective on the DjangoCon Africa review and approval and make suggestions for improvements or changes. We published her report in March.
  • We published a transparency report for our grants numbers from the last two years, and plan to publish a report on our grants work for every year going forward so we can continue to work in the open on continually improving the grants program.  
  • In May, the board voted that we will not override or impose any country-specific human rights regulation for Python communities when deciding whether or not to fund community-run Python or Python-related events. The Grants Program will use the same criteria for all grant requests, no matter their country of origin. This does not affect our criteria for choosing a specific US state for PyCon US and it does not change our ability to fund events in countries that are sanctioned by the US government (where the PSF is based.) Finally, the Grants Working Group will still require a robust and enforceable code of conduct and we expect local organizers to choose what is appropriate for their local community when drafting their code of conduct.


What is on our roadmap:

  • With community input, we’ll be overhauling the grant application process and requirements for applications. Our goal is to make the process inclusive and the administrative requirements as lightweight as possible, while not creating additional legal or administrative work.
  • We’re conducting a thorough examination of our grant priorities by subject matter and location. We hope to make requesting and reviewing grants for activities beyond events easier.
  • Continuing to reimagine the PSF Board’s responsibility within the community. Please read on for our thought process and work in this area.


Reevaluating PSF Board member communications and conduct norms and standards

We discussed Board member conduct and communications norms – both past and future – at our retreat in January. We realize that some things were said by past and current Board members that did not reflect the PSF’s outlook or values. We are working to ensure current and future Board members understand the power their communications have on our community. Understanding the expectations and responsibilities that come with service on the PSF Board is part of orientation for service. Going forward we plan to invest more time into this topic during our PSF Board orientations.

The Board has agreed to hold each other accountable and use the position of PSF Board member responsibly in communications with the community. We acknowledge that PSF Board members have not always recognized the impact that their comments have had on community members, either in private or in public. Going forward, community members can report board and board-delegated working group members’ conduct to individuals who do not serve on the board. Two members of the PSF’s Code of Conduct Working Group (Jeff Triplett (jeff.triplett@pyfound.org) and Tereza Iofciu (email is coming)) have volunteered to receive these reports and handle them separately. At a time that Jeff or Tereza are unable to receive these reports, other non-board members of the Code of Conduct working group will be nominated to manage such reports.

Moving forward together

Moving forward, the PSF Board and Staff will continue to prioritize transparency through the form of the Grants Office Hours and yearly reports. Our focus will move from response to charter, process, and documentation improvements based on the findings we have made. The PSF Board will continue to conduct annual orientations and ad hoc check-ins on our communication and conduct standards. We welcome you to send your questions, comments, and suggestions for the Grants Program to grants@pyfound.org.

As the great Maya Angelou has said, “Do the best you can until you know better. Then, when you know better, do better.” We want to thank the pan-African community for showing us that we can do better and we look forward to being a community partner that can be counted on to hear criticism and continually make changes that improve our service to the Python community.

Categories: FLOSS Project Planets

Python Insider: Python 3.13.0 beta 4 released

Planet Python - Thu, 2024-07-18 06:16

I'm pleased to announce the release of Python 3.13 beta 4.

https://www.python.org/downloads/release/python-3130b4/

 This is a beta preview of Python 3.13

Python 3.13 is still in development. This release, 3.13.0b4, is the final beta release preview of 3.13.

Beta release previews are intended to give the wider community the opportunity to test new features and bug fixes and to prepare their projects to support the new feature release.

We strongly encourage maintainers of third-party Python projects to test with 3.13 during the beta phase and report issues found to the Python bug tracker as soon as possible. While the release is planned to be feature complete entering the beta phase, it is possible that features may be modified or, in rare cases, deleted up until the start of the release candidate phase (Tuesday 2024-07-30). Our goal is to have no ABI changes after this final beta release, and as few code changes as possible after 3.13.0rc1, the first release candidate. To achieve that, it will be extremely important to get as much exposure for 3.13 as possible during the beta phase.

Please keep in mind that this is a preview release and its use is not recommended for production environments.

 Major new features of the 3.13 series, compared to 3.12

Some of the major new features and changes in Python 3.13 are:

Removals and new deprecations
  • PEP 594 (Removing dead batteries from the standard library) scheduled removals of many deprecated modules: aifc, audioop, chunk, cgi, cgitb, crypt, imghdr, mailcap, msilib, nis, nntplib, ossaudiodev, pipes, sndhdr, spwd, sunau, telnetlib, uu, xdrlib, lib2to3.
  • Many other removals of deprecated classes, functions and methods in various standard library modules.
  • C API removals and deprecations. (Some removals present in alpha 1 were reverted in alpha 2, as the removals were deemed too disruptive at this time.)
  • New deprecations, most of which are scheduled for removal from Python 3.15 or 3.16.

(Hey, fellow core developer, if a feature you find important is missing from this list, let Thomas know.)

For more details on the changes to Python 3.13, see What’s new in Python 3.13. The next pre-release of Python 3.13 will be 3.13.0rc1, the first release candidate, currently scheduled for 2024-07-30.

Enjoy the new releases

Thanks to all of the many volunteers who help make Python Development and these releases possible! Please consider supporting our efforts by volunteering yourself or through organization contributions to the Python Software Foundation.

 

Your release team,
Thomas Wouters
Łukasz Langa
Ned Deily
Steve Dower 

Categories: FLOSS Project Planets

Behind the Scenes of Embedded Updates

Planet KDE - Thu, 2024-07-18 04:00

An over-the-air (OTA) update capability is an increasingly critical part of any embedded product to close cybersecurity vulnerabilities, allow just-in-time product rollouts, stomp out bugs, and deliver new features. We’ve talked about some of the key structural elements that go into an embedded OTA architecture before. But what about the back end? Let’s address some of those considerations now.

The challenges of embedded connectivity

The ideal of a constant Internet connection is more aspiration than reality for many embedded devices. Sporadic connections, costly cellular or roaming charges, and limited bandwidth are common hurdles. These conditions necessitate smart management of update payloads and robust retry strategies that can withstand interruptions, resuming where they left off without getting locked in a continually restarting update cycle.
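
To make the retry idea concrete, here is a minimal Python sketch of a resumable, retrying download; the URL, chunk size, and backoff policy are assumptions, not any vendor's implementation, and a production updater would also verify the payload's integrity before installing it.

import os
import time
import urllib.request

def resumable_download(url: str, dest: str, attempts: int = 5) -> None:
    # Fetch `url` into `dest`, resuming from whatever partial file exists.
    for attempt in range(attempts):
        have = os.path.getsize(dest) if os.path.exists(dest) else 0
        request = urllib.request.Request(url, headers={"Range": f"bytes={have}-"})
        try:
            with urllib.request.urlopen(request, timeout=30) as response, \
                 open(dest, "ab") as out:
                while chunk := response.read(64 * 1024):
                    out.write(chunk)
            return  # download completed
        except OSError:
            time.sleep(2 ** attempt)  # back off, then resume where we left off
    raise RuntimeError(f"download of {url} did not complete")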

There are other ways to manage spotty connections. Consider using less frequent update schedules or empower users to initiate updates. These strategies however have trade-offs, including the potential to miss critical security patches. One way to strike a balance is to implement updates as either optional or mandatory, or flag updates as mandatory only when critical, allowing users to pace out updates when embedded connectivity isn’t reliable.

To USB or not to USB

When network access is very unreliable, or even just plain absent, then USB updates are indispensable for updating device software. These updates can also serve as effective emergency measures or for in-field support. While the process of downloading and preparing a USB update can often be beyond a normal user’s capability, it’s a critical fallback and useful tool for technical personnel.

OTA servers: SaaS or self-hosted

Deciding between software as a service (SaaS) and self-hosted options for your OTA server is a decision that impacts not just the update experience but also compliance with industry and privacy regulations. While SaaS solutions can offer ease and reliability, certain scenarios may necessitate on-premise servers. If you do need to host an OTA server yourself, you’ll need to supply the server hardware and assign a maintenance team to manage it. But you may not have to build it all from scratch – you can still call in the experts with proven experience in setting up self-hosted OTA solutions.

Certificates: The bedrock of OTA security

SSL certificates are non-negotiable for genuine and secure OTA updates. They verify your company as the authentic source of updates. Choosing algorithms with the longest (comparatively equivalent) key lengths will extend the reliable lifespan of these certificates. However, remember that certificates do expire; having a game plan in place to deal with expired certificates will allow you to avoid the panic of an emergency scramble if it should happen unexpectedly.
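
As a small illustration of that "game plan" idea, the sketch below (plain Python, not any particular vendor's tooling) checks how many days remain on a server's certificate so renewal can be scheduled well before expiry; the host name is a placeholder.

import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    # Return the number of days before the host's TLS certificate expires.
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

print(days_until_expiry("example.com"))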

Accurate timekeeping is also essential for validating SSL certificates. A functioning and accurate real-time clock that is regularly NTP/SNTP synchronized is critical. If timekeeping fails, your certificates won’t be validated properly, causing all sorts of issues. (We recommend reading our OTA best practice guide for advice on what to do proactively and reactively with invalidated or expired certificates.)

Payload encryption: Non-negotiable

Encrypted update payloads are imperative as a safeguard against reverse-engineering and content tampering. This is true for OTA updates as well as any USB or offline updates. Leveraging the strongest possible encryption keys that your device can handle will enhance security significantly.
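
As a toy illustration of the principle (assuming the third-party Python "cryptography" package; real OTA pipelines typically pair encryption with cryptographic signing of the payload):

from cryptography.fernet import Fernet

key = Fernet.generate_key()            # provisioned secretly to devices
payload = b"firmware image bytes"      # placeholder update payload

token = Fernet(key).encrypt(payload)   # what actually travels over the air or on USB
restored = Fernet(key).decrypt(token)  # performed on-device before installation
assert restored == payload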

Accommodating the right to repair

The growing ‘right to repair’ movement and associated legislation imply that devices should support updates outside of your organization’s tightly controlled processes. This may mean that you need to provide a manual USB update to meet repair requirements without exposing systems to unauthorized OTA updates. To prevent your support team from struggling with amateur software updates, you’ll want to configure your device to set a flag when unauthorized software has been loaded. This status can be checked by support teams to invalidate support or warranty agreements.

Summary

By carefully navigating the critical aspects of OTA updates, such as choosing the right hosting option and managing SSL certificates and encryption protocols, your embedded systems can remain up-to-date and secure under any operating conditions. While this post introduces the issues involved in embedded-system updates, there is much more to consider for a comprehensive strategy. For a deeper exploration and best practices in managing an embedded product software update strategy, please visit our best practice guide, Updates Outside the App Store.

About KDAB

If you like this article and want to read similar material, consider subscribing via our RSS feed.

Subscribe to KDAB TV for similar informative short video content.

KDAB provides market leading software consulting and development services and training in Qt, C++ and 3D/OpenGL. Contact us.

The post Behind the Scenes of Embedded Updates appeared first on KDAB.

Categories: FLOSS Project Planets

GSoC 2024: Midterm Updates For MankalaEngine

Planet KDE - Wed, 2024-07-17 20:00
Design Considerations

One of the main focuses while designing this engine was flexibility, to ensure the library is usable for a wide variety of Mancala variants. While certain abstractions are necessary to achieve this goal, it’s also important not to overly abstract the library.

Provided Functionality

MankalaEngine provides classes to assist in creating computerized opponents for many Mancala variants. As mentioned earlier, these classes are designed with a degree of abstraction to allow for greater flexibility on the developer’s end. In addition to these base classes, a concrete implementation tailored for the Bohnenspiel variant [1] is also provided.

The Board struct

The board struct is the base struct for a Mancala game board. It is used to specify the structure of the board used by a variant. As of now, this struct allows for a board with an arbitrary number of holes and two stores, one per player, as this seems to be the case for most Mancala games.

The Rules class

The rules class is the base class for the rules of a Mancala variant. It is used to specify the behaviour when making a move, what constitutes a valid move, when the game is over, etc. The only rule assumed to be shared by all variants is that the winning player is the one with more pebbles in their store, as this seems to be relatively common in Mancala variants.

The moveselection functions

The functions provided in the moveselection file are general adversarial search functions that can be used to select moves for Mancala games. In addition to Minimax [2] and MTD-f [3], random selection and user selection functions are also provided.

The Minimax move selection function

Minimax works by recursively exploring the game tree, considering each player’s moves and assuming that each player chooses the optimal move to maximize their chances of winning. If we’re scoring a Mancala position using an evaluation function that subtracts the pebbles in Player 2’s store from the pebbles in Player 1’s store, this means that Player 1 will want to maximize the score, while Player 2 will want to minimize it. The diagram below shows how a position is evaluated using the Minimax algorithm.

Each node represents a specific board configuration, and each level of the tree represents a turn of play. The tree nodes are squares on Player 1’s turn and circles on Player 2’s turn. Leaf nodes (nodes without children) are scored using the evaluation function. The rest of the nodes are scored by selecting the best score out of their children nodes - highest score if it’s Player 1’s turn and lowest score if it’s Player 2’s turn.

The Minimax implementation in the library also uses alpha-beta pruning, a technique used to reduce the number of nodes evaluated in the tree by eliminating branches that are guaranteed to be worse than previously examined branches.

A great explanation of Minimax and alpha-beta pruning can be found in Sebastian Lague’s YouTube video about this algorithm.
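
To make the idea concrete, here is an illustrative Python sketch of minimax with alpha-beta pruning; it is not the MankalaEngine C++ implementation, and children() and evaluate() stand in for a variant's move generation and scoring.

def minimax(state, depth, alpha, beta, maximizing):
    # children() and evaluate() are assumed helpers supplied by a variant's rules.
    # Leaf nodes are scored by the evaluation function described in this post,
    # e.g. pebbles in Player 1's store minus pebbles in Player 2's store.
    if depth == 0 or state.game_over():
        return evaluate(state)
    if maximizing:  # Player 1 picks the highest-scoring child
        best = float("-inf")
        for child in children(state):
            best = max(best, minimax(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, best)
            if beta <= alpha:
                break  # prune: Player 2 would never allow this branch
        return best
    best = float("inf")  # Player 2 picks the lowest-scoring child
    for child in children(state):
        best = min(best, minimax(child, depth - 1, alpha, beta, True))
        beta = min(beta, best)
        if beta <= alpha:
            break  # prune: Player 1 would never allow this branch
    return best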

The MTD-f move selection function

MTD-f works by repeatedly calling Minimax until it converges to a value. The Minimax used by MTD-f is implemented using alpha-beta pruning.

Since MTD-f calls Minimax several times, it’s also important to use a transposition table, which is a data structure that stores previously evaluated positions and their scores.
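
A minimal Python sketch of such a table (the key encoding and stored fields are assumptions, not necessarily what the library stores):

# Illustrative only: positions are keyed by a hashable encoding of the board,
# and the bounds and search depth are remembered so later passes can reuse them.
transposition_table = {}

def store(key, lower_bound, upper_bound, depth):
    transposition_table[key] = (lower_bound, upper_bound, depth)

def lookup(key):
    return transposition_table.get(key)  # None if this position is new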

Below is Aske Plaat’s pseudo-code for the algorithm.

function MTDF(root : node_type; f : integer; d : integer) : integer;
    g := f;
    upperbound := +INFINITY;
    lowerbound := -INFINITY;
    repeat
        if g == lowerbound then beta := g + 1 else beta := g;
        g := Minimax(root, beta - 1, beta, d);
        if g < beta then upperbound := g else lowerbound := g;
    until lowerbound >= upperbound;
    return g;

Evaluating Mancala positions

The static evaluation function used in this library consists of subtracting the pebbles in Player 2’s store from the pebbles in Player 1’s store. This is the same function that was used when solving the Mancala variant Kalah [4].

This way of scoring Mancala positions is particularly suitable for MTD-f, since, according to Plaat, the static evaluation function used should be coarse-grained, meaning that we should avoid using heuristics with little weight. As he says in his post about MTD(f), “The coarser the grain of eval, the less passes MTD(f) has to make to converge to the minimax value. Some programs have a fine grained evaluation function, where positional knowledge can be worth as little as one hundredst of a pawn.” [3].

The MankalaEngine class

The MankalaEngine class ties everything together. When instantiating it, you’ll need to choose a move selection function. It then provides a function, play, that, given the player whose turn it is to play, the rules to use, and the board in which the move will be played, executes the move selected by its move selection function. This allows reusing the common structure of a play across all Mancala variants while deferring variant-specific behaviour to the rules object.

bool MankalaEngine::play(Player player, const Rules& rules, Board& state) const {
    if (rules.gameOver(player, state)) {
        const Player winner = player == player_1 ? player_2 : player_1;
        rules.finishGame(winner, state);
        return false;
    }

    const int move = _selectMove(player, rules, state);
    rules.move(move, player, state);

    return true;
}

Next steps

The idea is to continue adding concrete variant implementations to the library so that developers wanting to create a Mancala game don’t have to implement them themselves. Additionally, adding the option to choose the difficulty of an opponent is also relevant. This may be implemented, for example, by allowing changes to the cutoff depth for the Minimax and MTD-f opponents, which is not currently supported.

As of now, the implemented Minimax only uses alpha-beta pruning and transposition tables. Adding more optimizations, such as move ordering, for example, might also be of interest.

Another possible route is developing a Qt application for playing Mancala that uses this engine. This would likely help generate interest in the project within the broader community.

If you’re interested in this project, you can check it out on Invent and come talk to us on Matrix.

References

1. “Das Bohnenspiel”, Wikipedia, 2023. https://en.wikipedia.org/wiki/Das_Bohnenspiel.

2. “Algorithms - Minimax” https://cs.stanford.edu/people/eroberts/courses/soco/projects/2003-04/intelligent-search/minimax.html.

3. “Aske Plaat: MTD(f), a new chess algorithm.” https://people.csail.mit.edu/plaat/mtdf.html.

4. G. Irving, J. Donkers, and J. Uiterwijk, “Solving Kalah”, Icga journal, vol. 23, no. 3, pp. 139–147, Sep. 2000, doi: 10.3233/ICG-2000-23303.

Categories: FLOSS Project Planets

scikit-learn: Interview with Yao Xiao, scikit-learn Team Member

Planet Python - Wed, 2024-07-17 20:00
Author: Reshama Shaikh , Yao Xiao

Yao Xiao recently earned his undergraduate degree in mathematics and computer science. He will be pursuing a Master’s degree in Computational Science and Engineering at Harvard SEAS. Yao joined the scikit-learn team in February 2024.

  1. Tell us about yourself.

    My name is Yao Xiao and I live in Shanghai, China. At the time of this interview I have just received my Bachelor’s degree in Honors Mathematics and Computer Science at NYU Shanghai, and I’m going to pursue a Master’s degree in Computational Science and Engineering at Harvard SEAS. My current research interests are in networks and systems (e.g. sys4ml and ml4sys), but this may change in the future.

  2. How did you first become involved in open source and scikit-learn?

    In my junior year I took a course at NYU Courant called Open Source Software Development where we needed to make contributions to an open source software as our final project - and I chose scikit-learn.

  3. We would love to learn of your open source journey.

    I was lucky to get involved in a pretty easy meta-issue when I first started contributing to scikit-learn. I made quite a few PRs towards that issue, familiarizing myself with the coding standards, the contributing workflow, etc., and during that time I gradually explored the codebase and learned a lot from maintainers about how to write better code. After that meta-issue was completed, I decided to continue contributing since I enjoyed the experience, and I started looking through the open issues, tried reproducing and investigating them, then opened PRs for those that I was able to solve. It is a process of becoming familiar with more parts of the codebase, being able to make more PRs, and so on and so forth. While contributing to scikit-learn, sometimes there are also issues to solve upstream, so I also had opportunities to contribute to projects like pandas and pydata-sphinx-theme. Up till today I’m still far from familiar with the entire scikit-learn project, but I will definitely continue the amazing open-source journey.

  4. To which OSS projects and communities do you contribute?

    I have contributed to scikit-learn, pandas, pydata-sphinx-theme, and sphinx-gallery. I’m also writing some small pieces of software that I have decided to make open source.

  5. What do you find alluring about OSS?

    It is amazing to feel that my code is being used by so many people all around the world through contributing to open source projects. Well it might be inappropriate to say “my code”, but I do feel like making some actual contributions to the community instead of just writing code for myself. Also OSS makes me care about code quality and so on instead of merely making things “work”, which is very important for programmers but not really taught in school.

  6. What pain points do you observe in community-led OSS?

    Collaboration can lead to better code but also slows down the development process. Especially when there are not enough reviewers around, issues and PRs can easily get stale or forgotten. But I would say it’s more like a tradeoff rather than a pain point.

  7. If we discuss how far OS has evolved in 10 years, what would you like to see happen?

    I couldn’t say about the past 10 years since I’ve only been involved for about one and a half years, but regarding the scientific Python ecosystem I would like to see better coordination across projects (which is already happening). For instance a common interface for array libraries and dataframe libraries would allow downstream dependents to easily provide more flexible support for different input/output types, etc. And as a Chinese I would also hope that open source can thrive in my country some day as well.

  8. What are your favorite resources, books, courses, conferences, etc?

    As for physical books, I would recommend The Pragmatic Programmer by Andy Hunt and Dave Thomas, and Refactoring: Improving the Design of Existing Code by Martin Fowler and Kent Beck. As for courses, I like MIT’s The Missing Semester of Your CS Education. In particular, for learning Python, The Python Tutorial in the official Python documentation is good enough for me. By the way, I want to mention that the documentation of most languages and popular packages is very nice, and it is the best place to learn the most up-to-date information.

  9. What are your hobbies, outside of work and open source?

    I would say my largest hobby is programming (not for school, not for work, just for fun). I’ve recently been fascinated with Tauri and wrote a lot of small desktop applications for myself in my spare time. Apart from this I also love playing the piano and I’m an anime lover, so I often listen to or play piano versions of anime theme songs (mostly arranged by Animenz).

Categories: FLOSS Project Planets

Dries Buytaert: Join the Drupal Starshot team as a track lead

Planet Drupal - Wed, 2024-07-17 19:24

The Drupal Starshot initiative has been making significant progress behind the scenes, and I'm excited to share some updates with the community.

Leadership team formation and product definition

Over the past few months, we've been working diligently on Drupal Starshot. One of our first steps was to appoint a leadership team to guide the project. With the leadership team in place as well as the new Starshot Advisory Council, we shifted our focus to defining the product. We've made substantial progress on this front and will be sharing more details about the product strategy in the coming weeks.

Introducing Drupal Starshot tracks

We already started to break down the initiative into manageable components, and are introducing the concept of "tracks". Tracks are smaller, focused parts of the Drupal Starshot project that allow for targeted development and contributions. We've already published the first set of tracks on the Drupal Starshot issue queue on Drupal.org.

Example tracks include:

  1. Creating Drupal Recipes for features like contact forms, advanced search, events, SEO and more.
  2. Enhancing the Drupal installer to enable Recipes during installation.
  3. Updating Drupal.org for Starshot, including product marketing and a trial experience.

While many tracks are technical and need help from developers, most of the tracks need contribution from designers, UX experts, marketers, testers and site builders.

Recruiting more track leads

Several tracks already have track leads and have made significant progress.

However, we need many additional track leads to drive our remaining tracks to completion.

We're now accepting applications for track lead positions. Interested individuals and organizations can apply by completing our application form. The application window closes on July 31st, two weeks from today.

Key responsibilities of a track lead

Track leads can be individuals, teams, or organizations, including Drupal Certified Partners. While technical expertise is beneficial, the role primarily focuses on strategic coordination and project management. Key responsibilities include:

  • Defining and validating requirements to ensure the track meets the expectations of our target audience.
  • Developing and maintaining a prioritized task list, including creating milestones and timelines.
  • Overseeing and driving the track's implementation.
  • Collaborating with key stakeholders, including the Drupal Starshot leadership team, module maintainers, the marketing team, etc.
  • Communicating progress to the community (e.g. blogging).

Track lead selection and announcement

After the application deadline, the Drupal Starshot Leadership Team will review the applications and appoint track leads. We expect to announce the selected track leads in the first week of August.

While the application period is open, we will be available to answer any questions you may have. Feel free to reach out to us through the Drupal.org issue queue, or join us in an upcoming zoom meeting (details to be announced / figured out).

Looking ahead to DrupalCon Barcelona

Our goal is to make significant progress on these tracks by DrupalCon Barcelona, where we plan to showcase the advancements we've made. We're excited about the momentum building around Drupal Starshot and can't wait to see the contributions from the community.

If you're passionate about Drupal and want to play a key role in shaping its future, consider applying for a track lead position.

Stay tuned for more updates on Drupal Starshot, and thank you for your continued support of the Drupal community.

Categories: FLOSS Project Planets

Dirk Eddelbuettel: Rcpp 1.0.13 on CRAN: Some Updates

Planet Debian - Wed, 2024-07-17 17:50

The Rcpp Core Team is once again pleased to announce a new release (now at 1.0.13) of the Rcpp package. It arrived on CRAN earlier today, and has since been uploaded to Debian. Windows and macOS builds should appear at CRAN in the next few days, as will builds in different Linux distributions, and of course r2u should catch up tomorrow too. The release was uploaded last week, but not only does Rcpp always get flagged because of the grandfathered .Call(symbol), but CRAN also found two packages ‘regressing’, which then took them five days to get back to us. One issue was known; another did not reproduce under our tests against over 2800 reverse dependencies, leading to the eventual release today. Yay. Checks are good and appreciated, and it does take time for humans to review them.

This release continues with the six-months January-July cycle started with release 1.0.5 in July 2020. As a reminder, we do of course make interim snapshot ‘dev’ or ‘rc’ releases available via the Rcpp drat repo as well as the r-universe page and repo and strongly encourage their use and testing—I run my systems with these versions which tend to work just as well, and are also fully tested against all reverse-dependencies.

Rcpp has long established itself as the most popular way of enhancing R with C or C++ code. Right now, 2867 packages on CRAN depend on Rcpp for making analytical code go faster and further, along with 256 in BioConductor. On CRAN, 13.6% of all packages depend (directly) on Rcpp, and 59.9% of all compiled packages do. From the cloud mirror of CRAN (which is but a subset of all CRAN downloads), Rcpp has been downloaded 86.3 million times. The two published papers (also included in the package as preprint vignettes) have, respectively, 1848 (JSS, 2011) and 324 (TAS, 2018) citations, while the book (Springer useR!, 2013) has another 641.

This release is incremental as usual, generally preserving existing capabilities faithfully while smoothing out corners and/or extending slightly, sometimes in response to changing and tightened demands from CRAN or R standards. The move towards a more standardized approach for the C API of R leads to a few changes; Kevin did most of the PRs for this. Andrew Johnson also provided a very nice PR to update internals taking advantage of variadic templates.

The full list below details all changes, their respective PRs and, if applicable, issue tickets. Big thanks from all of us to all contributors!

Changes in Rcpp release version 1.0.13 (2024-07-11)
  • Changes in Rcpp API:

    • Set R_NO_REMAP if not already defined (Dirk in #1296)

    • Add variadic templates to be used instead of generated code (Andrew Johnson in #1303)

    • Count variables were switched to size_t to avoid warnings about conversion-narrowing (Dirk in #1307)

    • Rcpp now avoids the usage of the (non-API) DATAPTR function when accessing the contents of Rcpp Vector objects where possible. (Kevin in #1310)

    • Rcpp now emits an R warning on out-of-bounds Vector accesses. This may become an error in a future Rcpp release. (Kevin in #1310)

    • Switch VECTOR_PTR and STRING_PTR to new API-compliant RO variants (Kevin in #1317 fixing #1316)

  • Changes in Rcpp Deployment:

    • Small updates to the CI test containers have been made (#1304)

Thanks to my CRANberries, you can also look at a diff to the previous release. Questions, comments, etc. should go to the rcpp-devel mailing list off the R-Forge page. Bug reports are welcome at the GitHub issue tracker as well (where one can also search among open or closed issues).

If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

Kalyani Kenekar: Securing Your Website: Installing and Configuring Nginx with SSL

Planet Debian - Wed, 2024-07-17 14:30

The Initial Encounter:

I recently started to work with Nginx to explore how to configure a so-called server block. It’s quite different from Apache, but there are tons of good websites out there which explain the different steps and options quite well. I also quickly realized that I need to be able to configure my Nginx setups in a way that the content is delivered through HTTPS, with automatic redirection from HTTP URLs.

  • Let’s install Nginx
Installing Nginx

$ sudo apt update
$ sudo apt install nginx

Checking your Web Server
  • Now we can check whether the nginx service is active or inactive (for example, with systemctl status nginx):
Output

● nginx.service - A high performance web server and a reverse proxy server
     Loaded: loaded (/lib/systemd/system/nginx.service; enabled; vendor preset: enabled)
     Active: active (running) since Tue 2024-02-12 09:59:20 UTC; 3h ago
       Docs: man:nginx(8)
   Main PID: 2887 (nginx)
      Tasks: 2 (limit: 1132)
     Memory: 4.2M
        CPU: 81ms
     CGroup: /system.slice/nginx.service
             ├─2887 nginx: master process /usr/sbin/nginx -g daemon on; master_process on;
             └─2890 nginx: worker process
  • Now we have successfully installed Nginx and it is in a running state.
How To Secure Nginx with Let’s Encrypt on Debian 12
  • In this documentation, you will use Certbot to obtain a free SSL certificate for Nginx on Debian 12 and set up your certificate.
Step 1 — Installing Certbot

$ sudo apt install certbot python3-certbot-nginx

  • Certbot is now ready to use, but in order for it to automatically configure SSL for Nginx, we need to verify some of Nginx’s configuration.
Step 2 — Confirming Nginx’s Configuration
  • Certbot needs to be able to find the correct server block in your Nginx configuration for it to be able to automatically configure SSL. Specifically, it does this by looking for a server_name directive that matches the domain you request a certificate for. To check, open the configuration file for your domain using nano or your favorite text editor.

$ sudo vi /etc/nginx/sites-available/example.com

server {
    listen 80;
    root /var/www/html/;
    index index.html;
    server_name example.com;

    location / {
        try_files $uri $uri/ =404;
    }

    location /test.html {
        try_files $uri $uri/ =404;
        auth_basic "admin area";
        auth_basic_user_file /etc/nginx/.htpasswd;
    }
}
  • Fill in the above data according to your project, then save the file, quit your editor, and verify the syntax of your configuration edits.

$ sudo nginx -t

Step 3 — Obtaining an SSL Certificate
  • Certbot provides a variety of ways to obtain SSL certificates through plugins. The Nginx plugin will take care of reconfiguring Nginx and reloading the config whenever necessary. To use this plugin, type the following command line.

$ sudo certbot --nginx -d example.com

  • The configuration will be updated, and Nginx will reload to pick up the new settings. certbot will wrap up with a message telling you the process was successful and where your certificates are stored.
Output

IMPORTANT NOTES:
- Congratulations! Your certificate and chain have been saved at:
  /etc/letsencrypt/live/example.com/fullchain.pem
  Your key file has been saved at:
  /etc/letsencrypt/live/example.com/privkey.pem
  Your cert will expire on 2024-05-12. To obtain a new or tweaked version of this
  certificate in the future, simply run certbot again with the "certonly" option.
  To non-interactively renew *all* of your certificates, run "certbot renew"
- If you like Certbot, please consider supporting our work by:
  Donating to ISRG / Let's Encrypt:   https://letsencrypt.org/donate
  Donating to EFF:                    https://eff.org/donate-le
  • Your certificates are downloaded, installed, and loaded. Next, check the syntax of your configuration again.

$ sudo nginx -t

  • If you get an error, reopen the server block file and check for any typos or missing characters. Once your configuration file’s syntax is correct, reload Nginx to load the new configuration.

$ sudo systemctl reload nginx

  • Try reloading your website using https:// and notice your browser’s security indicator. It should indicate that the site is properly secured, usually with a lock icon.

Now your website is secured by SSL.

Categories: FLOSS Project Planets

roose.digital: This is how you redirect all visitors to the HTTPS version of your Drupal website with or without WWW

Planet Drupal - Wed, 2024-07-17 14:25
Redirecting all your traffic from HTTP to HTTPS with or without WWW using Drupal can be a complex and frustrating task. Fortunately, there is a relatively simple solution that can help you achieve this in just a few steps.
Categories: FLOSS Project Planets

Lullabot: The Easy Guide to Resolving composer.lock Conflicts

Planet Drupal - Wed, 2024-07-17 14:08

Resolving merge conflicts is a great advanced part of Composer’s documentation, but we’ve found that it leaves many readers more confused than confident. If there is a composer.lock conflict, all you get is this error message:

Categories: FLOSS Project Planets

Gunnar Wolf: Script for weather reporting in Waybar

Planet Debian - Wed, 2024-07-17 13:32

While I was living in Argentina, we (my family) found ourselves checking weather forecasts almost constantly — weather there can be quite unexpected, much more so than here in Mexico. So it took me a bit of tinkering to come up with a couple of simple scripts to show the weather forecast as part of my Waybar setup. I hadn’t cared to share them with anybody, as I believe them to be quite trivial and quite dirty.

But today, Víctor was asking for some slightly-related things, so here I go. Please do remember I warned: Dirty.

I am using OpenWeather’s open API. I had to register to get an APPID, and it allows me up to 1,000 API calls per day, more than plenty for my uses, even if I am logged in at my desktops at three different computers (not an uncommon situation). Having that, I set up a file named /etc/get_weather/, which currently reads:

# Home, Mexico City
LAT=19.3364
LONG=-99.1819
# # Home, Paraná, Argentina
# LAT=-31.7208
# LONG=-60.5317
# # PKNU, Busan, South Korea
# LAT=35.1339
#LONG=129.1055
APPID=SomeLongRandomStringIAmNotSharing

Then, I have a simple script, /usr/local/bin/get_weather, that fetches the current weather and the forecast, and stores them as /run/weather.json and /run/forecast.json:

#!/usr/bin/bash
CONF_FILE=/etc/get_weather

if [ -e "$CONF_FILE" ]; then
    . "$CONF_FILE"
else
    echo "Configuration file $CONF_FILE not found"
    exit 1
fi

if [ -z "$LAT" -o -z "$LONG" -o -z "$APPID" ]; then
    echo "Configuration file must declare latitude (LAT), longitude (LONG) "
    echo "and app ID (APPID)."
    exit 1
fi

CURRENT=/run/weather.json
FORECAST=/run/forecast.json

wget -q "https://api.openweathermap.org/data/2.5/weather?lat=${LAT}&lon=${LONG}&units=metric&appid=${APPID}" -O "${CURRENT}"
wget -q "https://api.openweathermap.org/data/2.5/forecast?lat=${LAT}&lon=${LONG}&units=metric&appid=${APPID}" -O "${FORECAST}"

This script is called by the corresponding systemd service unit, found at /etc/systemd/system/get_weather.service:

[Unit]
Description=Get the current weather

[Service]
Type=oneshot
ExecStart=/usr/local/bin/get_weather

And it is run every 15 minutes via the following systemd timer unit, /etc/systemd/system/get_weather.timer:

[Unit]
Description=Get the current weather every 15 minutes

[Timer]
OnCalendar=*:00/15:00
Unit=get_weather.service

[Install]
WantedBy=multi-user.target

(yes, it runs even if I’m not logged in, wasting some of my free API calls… but within reason)

Then, I declare a "custom/weather" module in the desired position of my ~/.config/waybar/waybar.config, and define it as:

"custom/weather": {
    "exec": "while true;do /home/gwolf/bin/parse_weather.rb;sleep 10; done",
    "return-type": "json",
},

This script basically morphs a generic weather JSON description into another set of JSON bits that display my weather in the way I prefer to have it displayed as:

#!/usr/bin/ruby
require 'json'

Sources = {:weather => '/run/weather.json', :forecast => '/run/forecast.json' }
Icons = {'01d' => '🌞', # d → day
         '01n' => '🌃', # n → night
         '02d' => '🌤️',
         '02n' => '🌥',
         '03d' => '☁️',
         '03n' => '🌤',
         '04d' => '☁️',
         '04n' => '🌤',
         '09d' => '🌧️',
         '10n' => '🌧 ',
         '10d' => '🌦️',
         '13d' => '❄️',
         '50d' => '🌫️' }

ret = {'text': nil, 'tooltip': nil, 'class': 'weather', 'percentage': 100}

# Current weather report: Main text of the module
begin
  weather = JSON.parse(open(Sources[:weather],'r').read)
  loc_name = weather['name']
  icon = Icons[weather['weather'][0]['icon']] || '?' + f['weather'][0]['icon'] + f['weather'][0]['main']

  temp = weather['main']['temp']
  sens = weather['main']['feels_like']
  hum = weather['main']['humidity']
  wind_vel = weather['wind']['speed']
  wind_dir = weather['wind']['deg']

  portions = {}
  portions[:loc] = loc_name
  portions[:temp] = '%s 🌡%2.2f°C (%2.2f)' % [icon, temp, sens]
  portions[:hum] = '💧 %2d%%' % hum
  portions[:wind] = '🌬%2.2fm/s %d°' % [wind_vel, wind_dir]

  ret['text'] = [:loc, :temp, :hum, :wind].map {|p| portions[p]}.join(' ')
rescue => err
  ret['text'] = 'Could not process weather file (%s ⇒ %s: %s)' % [Sources[:weather], err.class, err.to_s]
end

# Weather prevision for the following hours/days
begin
  cast = []
  forecast = JSON.parse(open(Sources[:forecast], 'r').read)
  min = ''
  max = ''
  day = Time.now.strftime('%Y.%m.%d')
  by_day = {}
  forecast['list'].each_with_index do |f,i|
    by_day[day] ||= []
    time = Time.at(f['dt'])
    time_lbl = '%02d:%02d' % [time.hour, time.min]
    icon = Icons[f['weather'][0]['icon']] || '?' + f['weather'][0]['icon'] + f['weather'][0]['main']
    by_day[day] << f['main']['temp']
    if time.hour == 0
      min = '%2.2f' % by_day[day].min
      max = '%2.2f' % by_day[day].max
      cast << ' ↑ min: <b>%s°C</b> max: <b>%s°C</b>' % [min, max]
      day = time.strftime('%Y.%m.%d')
      cast << ' ┍━━━━━┫ <b>%04d.%02d.%02d</b> ┠━━━━━┑' % [time.year, time.month, time.day]
    end
    cast << '%s | %2.2f°C | 🌢%2d%% | %s %s' % [time_lbl, f['main']['temp'], f['main']['humidity'], icon, f['weather'][0]['description'] ]
  end
  cast << ' ↑ min: <b>%s</b>°C max: <b>%s°C</b>' % [min, max]
  ret['tooltip'] = cast.join("\n")
rescue => err
  ret['tooltip'] = 'Could not process forecast file (%s ⇒ %s)' % [Sources[:forecast], err.class, err.to_s]
end

# Print out the result for Waybar to process
puts ret.to_json

The end result? Nothing too stunning, but definitely something I find useful and even nicely laid out:

Do note that it seems OpenWeather will return the name of the closest available meteorology station with (most?) recent data — for my home, I often get Ciudad Universitaria, but sometimes Coyoacán or even San Ángel Inn.

Categories: FLOSS Project Planets

Tag1 Consulting: Migrating Your Data from D7 to D10: The migration process pipeline

Planet Drupal - Wed, 2024-07-17 11:23

Series Overview & ToC | Previous Article | Next Article - coming July 24th

Our last article explored the syntax and structure of migration files. Today, we are diving deeper into the most important part of a migration: the process pipeline. This determines how source data will be processed and transformed to match the expected destination structure. We will learn how to configure and chain process plugins, how to set subfields and deltas for multi-value fields, and how to work with source constants and pseudo-fields. Let’s get started.

From source to destination

The process section in a migration is responsible for transforming data as extracted from the source into a format that the destination expects. The collection of all those data transformations is known as the migration process pipeline. The Migrate API is a generic ETL framework. This means the source data can come from different types of sources like a database table; a CSV, JSON, or XML file; a remote API using JSON:API or GraphQL; or something else. The destination can be as diverse, including databases, text files, and remote APIs. Because the series focuses on migrating from Drupal 7 to 10, most of our discussion will revolve...

Categories: FLOSS Project Planets

Real Python: Python Protocols: Leveraging Structural Subtyping

Planet Python - Wed, 2024-07-17 10:00

In Python, a protocol specifies the methods and attributes that a class must implement to be considered of a given type. Protocols are important in Python’s type hint system, which allows for static type checking through external tools, such as mypy, Pyright, and Pyre.

Before there were protocols, these tools could only check for nominal subtyping based on inheritance. There was no way to check for structural subtyping, which relies on the internal structure of classes. This limitation affected Python’s duck typing system, which allows you to use objects without considering their nominal types. Protocols overcome this limitation, making static duck typing possible.

In this tutorial, you’ll:

  • Gain clarity around the use of the term protocol in Python
  • Learn how type hints facilitate static type checking
  • Learn how protocols allow static duck typing
  • Create custom protocols with the Protocol class
  • Understand the differences between protocols and abstract base classes

To get the most out of this tutorial, you’ll need to know the basics of object-oriented programming in Python, including concepts such as classes and inheritance. You should also know about type checking and duck typing in Python.

Get Your Code: Click here to download the free sample code that shows you how to leverage structural subtyping with Python protocols

The Meaning of “Protocol” in Python

During Python’s evolution, the term protocol became overloaded with two subtly different meanings. The first meaning refers to internal protocols, such as the iterator, context manager, and descriptor protocols.

These protocols are widely understood in the community and consist of special methods that make up a given protocol. For example, the .__iter__() and .__next__() methods define the iterator protocol.

Python 3.8 introduced a second, slightly different type of protocol. These protocols specify the methods and attributes that a class must implement to be considered of a given type. So, these protocols also have to do with a class’s internal structure.

With this kind of protocol, you can define interchangeable classes as long as they share a common internal structure. This feature allows you to enforce a relationship between types or classes without the burden of inheritance. This relationship is known as structural subtyping or static duck typing.
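
As a quick, hedged illustration (the class names below are mine, not from the article), a protocol class lets a static type checker accept any object whose structure matches, with no inheritance involved:

from typing import Protocol

class HasArea(Protocol):
    def area(self) -> float: ...

class Circle:  # note: does not inherit from HasArea
    def __init__(self, radius: float) -> None:
        self.radius = radius

    def area(self) -> float:
        return 3.14159 * self.radius ** 2

def describe(shape: HasArea) -> str:
    return f"area = {shape.area():.2f}"

print(describe(Circle(2.0)))  # accepted by type checkers because Circle's structure matches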

In this tutorial, you’ll focus on this second meaning of the term protocol. First, you’ll have a look at how Python manages types.

Dynamic and Static Typing in Python

Python is a dynamically typed language, which means that the Python interpreter checks an object’s type when the code runs. It also means that while a variable can only reference one object at a time, the type of that object can change during the variable’s lifetime.

For example, you can have a variable that starts as a string and changes into an integer number:

>>> value = "One hundred"
>>> value
'One hundred'
>>> value = 100
>>> value
100

In this example, you have a variable that starts as a string. Later in your code, you change the variable’s value to an integer.

Because of its dynamic nature, Python has embraced a flexible typing system that’s known as duck typing.

Duck Typing

Duck typing is a type system in which an object is considered compatible with a given type if it has all the methods and attributes that the type requires. This typing system supports the ability to use objects of independent and decoupled classes in a specific context as long as they adhere to some common interface.

Note: To dive deeper into duck typing, check out the Duck Typing in Python: Writing Flexible and Decoupled Code tutorial.

As an example of duck typing, you can consider built-in container data types, such as lists, tuples, strings, dictionaries, and sets. All of these data types support iteration:

>>> numbers = [1, 2, 3]
>>> person = ("Jane", 25, "Python Dev")
>>> letters = "abc"
>>> ordinals = {"one": "first", "two": "second", "three": "third"}
>>> even_digits = {2, 4, 6, 8}
>>> containers = [numbers, person, letters, ordinals, even_digits]
>>> for container in containers:
...     for element in container:
...         print(element, end=" ")
...     print()
...
1 2 3
Jane 25 Python Dev
a b c
one two three
8 2 4 6

In this code snippet, you define a few variables using different built-in types. Then, you start a for loop over the collections and iterate over each of them to print their elements to the screen. Even though the built-in types are significantly different from one another, they all support iteration.

The duck typing system allows you to create code that can work with different objects, provided that they share a common interface. This system allows you to set relationships between classes that don’t rely on inheritance, which produces flexible and decoupled code.

Read the full article at https://realpython.com/python-protocol/ »


Categories: FLOSS Project Planets

Real Python: Quiz: How Do You Choose Python Function Names?

Planet Python - Wed, 2024-07-17 08:00

In this quiz, you’ll test your understanding of how to choose Python function names.

By working through this quiz, you’ll revisit the rules and conventions for naming Python functions and why they’re important for writing Pythonic code.

Choosing the ideal Python function names makes your code more readable and easier to maintain. Code with well-chosen names can also be less prone to bugs.


Categories: FLOSS Project Planets

Anwesha Das: Looking back to Euro Python 2024

Planet Python - Wed, 2024-07-17 07:42

Over the years, when I am low, I always go to the 2014 Euro Python talk "Farewell and Welcome Home: Python in Two Genders" by Naomi. It has become the first step of my coping mechanism and the door to my safe house. Though 2024 marked my first Euro Python in person, I have long had a connection with and respect for the conference: a conference that believes community matters, that human values and feelings matter, and that is not afraid to walk the talk. And the conference lived up to my expectations in every bit.

My Talk: Intellectual Property Law 101

I gave my talk on Intellectual Property Law on the first day. It had been a long time since I had given a talk on a legal topic. This talk was dedicated to developers, so I concentrated only on the issues that concern them and tried to stitch the related topics of patents, trademarks, and copyright together into a smooth flow, making it easier for developers to understand and remember for practical use. I was concerned whether I would be able to connect with people. Later, people came to me with several related questions, starting with:

  • Why should I be concerned about patents?

  • Which license would fit my project?

  • Should I be scared about any Trademarks granted to other organizations under some other jurisdiction?

So on and so forth. Though I could not finish the whole talk due to time constraints, I am happy with the overall review.

Panel: Open Source Sustainability

On Day 1 of the main conference, we had the panel on Open Source Sustainability. This topic lies at the core of open-source ecosystem sustainability: the stability and future of projects and their communities. The panel had Deb Nicholson, Armin Ronacher, Çağıl Uluşahin Sönmez, Samuel Colvin, and me, with Artur Czepiel as the moderator. I was happy to represent my community's side. It was a good discussion, and hopefully we could give answers to some of the community's questions in general.

Birds of Feather session: Open Source Release Management

This Birds of Feathers (BoF) session is intended to deal with the Release Management of various Open Source projects, irrespective of their size. The discussion includes all projects, from a community-led project to projects maintained/initiated by big enterprises, from a project maintained by one contributor to a project with several hundred contributors.

  • What methods do we follow regarding versioning, release cadence, and the process?

  • Do most of us follow manual processes or depend on automated ones?

  • What works and what does not, and how can we improve our lives?

  • What are the significant points that make the difference?

We discussed and covered the following topics: different aspects of release management of Open-Source projects, security, automation, CI usage, and documentation. We followed the Chatham House Rules during the discussion to provide the space for open, frank, and collaborative conversation.

PyLadies Lunch

And then comes my favorite part of the conference: PyLadies Lunch. It was my seventh PyLadies lunch, and I was moderating it for the fifth time. But this time, my wonderful friends Laís and Çağıl were by my side, holding me up when I failed. I love every time I am at a PyLadies lunch. This is where I get my strength, energy, and love.

Workshop

I attended two workshops organized by Anezka Muller, Mia Bajić, and all the amazing PyLadies organizers:

  • Self-defense workshop where the moderators helped us navigate challenging situations we face in life, safeguard ourselves from them, and overcome them.

  • I AM Remarkable workshop, where we learned to tell people about our successes.

Representing Ansible Community

I always take the chance to meet the Ansible community members face-to-face. Euro Python gave me another opportunity to do that. I learned about different user stories that we do not get to hear from our work corners, and I learned about these unique problems and their solutions in Ansible. 
Fun fact: Maarten gave a review after learning that I am Anwesha from the Ansible project. He said, 'Can you Ansible people slow down in releasing new versions of Ansible? Every time we get used to it, we have a new version.'

Acknowledging mental health issues

The proudest moment for me personally was when I acknowledged my mental health issues, and later when people came to me saying how they related to me and how empowered they felt when I mentioned it.

PyLadies network at Red Hat

A network of PyLadies within Red Hat has been my dream since I joined Red Hat. Karolina also agreed when I shared this with her at last year's DevConf. And finally, we initiated it on day 2 of the conference. We are so excited for the future to come.

Meeting friends

Conference means friends. It was so great to meet so many friends after such a long time: Tylor, Nicholas, Naomi, Honza, Carol, Mike, Artur, Nikita, and Valerio, and many new ones: Jannis, Joana, Christian, Martina, Tereza, Maria, Alyona, Mia, Naa, Bojan, and Jodie. A special note of love to Jodie, for holding my hand and taking me out of the dark.

The best is saved for last: Euro Python 2024 made three of my dreams come true.

  • Gender Neutral Washrooms

  • Sanitary products in the restrooms (I remember carrying sanitary napkins in my backpack at PyCon India and telling girls that if they needed one, it was available at the PyLadies booth).

  • Neurodiversity bag (which saved me at the conference; thank you, Karolina, for this)

I cannot wait for the next Euro Python; see you all at Euro Python 2025.

PS: Thanks to Laís, I will always have a small piece of Euro Python 2024 with me. I know I am loved and cared for.

Categories: FLOSS Project Planets

Python Software Foundation: Announcing the 2024 PSF Board Election & Proposed Bylaw Change Results!

Planet Python - Wed, 2024-07-17 07:11

The 2024 election for the PSF Board and proposed Bylaws changes created an opportunity for conversations about the PSF's work to serve the global Python community. We appreciate community members' perspectives, passion, and engagement in the election process this year.

We want to send a big thanks to everyone who ran and was willing to serve on the PSF Board. Even if you were not elected, we appreciate all the time and effort you put into thinking about how to improve the PSF and represent the parts of the community you participate in. We hope that you will continue to think about these issues, share your ideas, and join a PSF Work Group if you feel called to do so.

Board Members Elect

Congratulations to our three new Board members who have been elected!

  • Tania Allard
  • KwonHan Bae
  • Cristián Maureira-Fredes

We’ll be in touch with all the elected candidates shortly to schedule onboarding. Newly elected PSF Board members are provided orientation for their service and will be joining the upcoming board meeting.

PSF Bylaw Changes

All three of the proposed PSF Bylaw changes have been approved.

We appreciate the high level of engagement with the proposed Bylaw changes, and the range of perspectives and points that were raised. We hope that our efforts towards increased transparency, such as the Office Hour session, and our responses in the FAQ helped to continue building trust with the community. Our goals with these changes continue to be:

  • Making it simpler to qualify as a Member for Python-related volunteer work
  • Making it easier to vote
  • Allowing the Board more options to keep our membership safe and enforce the Code of Conduct

This announcement serves as notice that the Bylaws changes have been approved by the membership, and will automatically go into effect 15 days from now, on Thursday, August 1st, 2024.

Thank you!

We’d like to take this opportunity to thank our outgoing board member, Débora Azevedo, for her outstanding service. Débora served on the PSF Board through a particularly eventful time: bringing PyCon US into an age of hybrid events, responding to calls from our community for transparency, and hiring multiple new staff members to continue improving our organization. Thank you for supporting the PSF and the Python community through so much change; you are appreciated!

Our heartfelt thanks go out to each of you who took the time to review the candidates and submit your votes. Your participation helps the PSF represent our community. We received 611 total votes, easily reaching quorum, which is one third of our 794 affirmed voting members. We’re especially grateful for your patience in continuing to navigate the changes to the voting process, which allow for a valid election and a more sustainable election system.

We also want to thank everyone who helped promote this year’s board election, especially Board Members Denny Perez and Georgi Ker, who took the initiative to cover this year’s election and produced informational videos for our candidates. This promotional effort was inspired by the work of Python Community News last year. We also want to highlight the PSF staff members and PSF Board members who put in tons of effort each year as we work to continually improve the PSF elections.

What’s next?

If you’re interested in the complete tally, make sure to check the Python Software Foundation Board of Directors Election 2024 Results page. These results will be available until 10 Sep 2024 at 10:00 AM EDT.

The PSF Election team will conduct a retrospective of this year’s election process to ensure we are improving year over year. We received valuable feedback about the process and tooling. We hope to be able to implement changes for next year to ensure a smooth and accessible election process for everyone in our community.

Finally, it might feel a little early to mention this, but we will have at least 4 seats open again next year. If you're interested in running or learning more, we encourage you to contact a current PSF Board member or two this year and ask them about their experience serving on the board.

Categories: FLOSS Project Planets

health @ Savannah: MyGNUHealth 2.2.1 released

GNU Planet! - Wed, 2024-07-17 06:10

Dear community

I am happy to announce patchset 2.2.1 for MyGNUHealth, the GNU Health Personal Health Record.

This patchset fixes the following issues:


You can download the MyGNUHealth source code from the official GNU ftp site (https://ftp.gnu.org/gnu/health/mygnuhealth/). You can also install MyGH from the Python Package Index (PyPI) or from your operating system distribution.

Happy hacking
Luis

Categories: FLOSS Project Planets

Mike Gabriel: Weather Experts with Translation Skills Needed!

Planet Debian - Wed, 2024-07-17 06:05
Lomiri Weather App goes Open Meteo

In Ubuntu Touch / Lomiri, Maciej Sopyło has updated Lomiri's Weather App to use a different weather forecast provider (Open Meteo). The new implementation is also generic and pluggable, so other weather data providers can be added later.

Big thanks to Maciej for working on this just in time (the previous implementation's API was recently EOL'ed and is no longer available to Ubuntu Touch / Lomiri users).
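
For readers curious what "generic and pluggable" can look like in practice, here is a minimal sketch in Python. It is purely illustrative: the actual app is not written in Python, none of these identifiers come from the Lomiri code base, and the Open Meteo query parameters shown are examples rather than a complete request.

    # Illustrative sketch only: each weather data provider implements the same
    # small interface, and the app talks to that interface rather than to any
    # specific backend.
    from abc import ABC, abstractmethod
    from urllib.parse import urlencode

    class WeatherProvider(ABC):
        @abstractmethod
        def forecast_url(self, latitude: float, longitude: float) -> str:
            """Return the request URL for a daily forecast at the given location."""

    class OpenMeteoProvider(WeatherProvider):
        BASE_URL = "https://api.open-meteo.com/v1/forecast"

        def forecast_url(self, latitude: float, longitude: float) -> str:
            query = urlencode({
                "latitude": latitude,
                "longitude": longitude,
                "daily": "weather_code,temperature_2m_max,temperature_2m_min",
            })
            return f"{self.BASE_URL}?{query}"

    PROVIDERS = {"open-meteo": OpenMeteoProvider}

    def make_provider(name: str) -> WeatherProvider:
        # Adding another provider means one new class and one new entry here.
        return PROVIDERS[name]()

    print(make_provider("open-meteo").forecast_url(52.52, 13.41))  # e.g. Berlin

Translations enter the picture because the weather condition codes such a provider returns now have to be mapped to human-readable, localized strings inside the app itself.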

Lomiri Weather App: Meteorological Terms Are Now Part of the App

While the old weather data provider implementation obtained all the meteorological information from the provider as already-localized strings, the new implementation requires the various weather conditions to be translated within the Lomiri Weather App itself.

The meteorological terms are probably not easy for the average software translator to handle, so specialist help might be required here.

Call for Translations: Lomiri Weather App

So, if you feel you can help here, please join the Hosted Weblate service [1] and start working on the Lomiri Weather App [2].

Thanks a lot!

light+love
Mike Gabriel (aka sunweaver)

[1] https://hosted.weblate.org/
[2] https://hosted.weblate.org/projects/lomiri/lomiri-weather-app/

Categories: FLOSS Project Planets

mark.ie: Keyboard Navigation for a LocalGov Drupal website

Planet Drupal - Wed, 2024-07-17 05:27

Wouldn't it be cool if we could get around our LocalGov Drupal websites by using keyboard shortcuts?

Categories: FLOSS Project Planets

Akademy 2024 Program Now Live

Planet KDE - Wed, 2024-07-17 04:41

The Akademy 2024 Program is now available.

This year's Akademy will take place in Würzburg, a beautiful city where you can enjoy fascinating talks, panels and keynotes. And for those who prefer to participate remotely, Akademy will also be available online.

Akademy officially kicks off with a welcome event on Friday 6 September, followed by a series of talks on Saturday 7 September and Sunday 8 September. From Monday 9 to Thursday 12 September, there will be BoFs (Birds of a Feather), workshops, meetings, a day trip and training sessions.

The talks will cover KDE's goals, how we're doing with adopting other programming languages for KDE development (Rust, anyone?), what's new in the latest wave of desktop and mobile applications, how KDE Eco is saving the environment, backends, frontends, and KDE for work, life and fun.

For example, Nicolas Fella will tell us what a software maintainer does and why they are crucial to a project's survival, Aleix Pol Gonzalez will demystify embedded Linux, and Kevin Ottens will take us deep into the core of KDE Neon. You will also learn more about Plasma Mobile, funding your dream project and cool new KWin effects.

You can expect much, much more from a schedule packed with exciting talks and eye-opening presentations. Just take a look at the full program to discover everything that will be happening.

And that is not all! Stay tuned for the announcement of our two keynote speakers, coming soon here on Planet.

During the week, KDE community members will attend BoFs and meet with colleagues who share their interests to work on their projects. They will also attend workshops, meetings, training sessions and the day trip until the event closes on 12 September.

Categories: FLOSS Project Planets
