PyBites: Leveraging typing.Protocol: Faster Error Detection And Beyond Inheritance

Planet Python - Fri, 2024-02-09 10:52

Two weeks ago I wrote an article about ABCs and interface enforcement. Shortly after I learned that you can do this as well with protocols.

Python 3.8 introduced quite a groundbreaking feature that further advanced the language’s type checking capabilities: typing.Protocol, which allows Python developers to define and enforce interface contracts in their code.

Unlike traditional dynamic typing, which Python is well-known for, typing.Protocol brings a layer of static type checking that allows for more explicit, readable, and robust code by defining expected behaviors and structures.

This feature not only enhances code quality and developer productivity but also aligns Python more closely with the benefits seen in statically-typed languages, all while maintaining Python’s expressive and dynamic nature.

In this article, we delve into the potential of typing.Protocol through the lens of refactoring the Pybites Search feature, showcasing the immediate benefits of early error detection.

Before we dive in, in case you missed the last article, let’s first clarify what “enforcing an interface” actually means …

What is interface enforcement?

Interface enforcement in object-oriented programming (OOP) is a design principle where you define a contract or a blueprint for what methods and properties should be present in a class.

This contract ensures that all classes implementing the interface adhere to a specific structure, promoting consistency and predictability across your codebase.

For example, if you define an interface for a Search functionality, every class implementing this interface must provide the specific methods defined, such as match_content.

This approach allows programmers to design components that can interact seamlessly, knowing the expected behaviors and data types, thereby reducing errors and improving code maintainability.

The ABC / Abstract Method Way

Remember how we implemented PyBites Search originally using Abstract Base Classes (ABCs) to enforce a consistent interface for search functionality:

from abc import ABC, abstractmethod


class PybitesSearch(ABC):
    @abstractmethod
    def match_content(self, search: str) -> list:  # on >= 3.9 you can use list over typing.List
        ...

While ABCs served us well here, the interface enforcement only happens at runtime.
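To see that limitation concretely, here is a minimal sketch (the BrokenSearch subclass is invented for illustration): forgetting to implement the abstract method goes completely unnoticed until the class is actually instantiated at runtime.

```python
from abc import ABC, abstractmethod


class PybitesSearch(ABC):
    @abstractmethod
    def match_content(self, search: str) -> list:
        ...


class BrokenSearch(PybitesSearch):
    # We "forgot" to implement match_content -- the class
    # definition itself is accepted without complaint.
    pass


# The contract violation only surfaces here, when we instantiate:
try:
    BrokenSearch()
except TypeError as err:
    print(err)  # Can't instantiate abstract class BrokenSearch ...
```

If this line is buried in a rarely exercised code path, the bug ships to production.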

Static type checking vs runtime checks

Static type checking offers several advantages over runtime checks, primarily by shifting error detection earlier in the development process.

By identifying type mismatches and potential bugs during coding or at compile time, developers can address issues before they manifest in a running application, leading to more stable and reliable software.

Static type checking also enhances code readability and documentation, as type annotations provide clear expectations for variables, arguments, and return types.

This explicitness improves developer collaboration and facilitates easier maintenance and debugging of the codebase.

Moreover, static type checking can enable performance optimizations: while CPython itself ignores annotations at runtime, tools such as ahead-of-time compilers can use the declared types to make assumptions about values and generate faster code.

If you are new to Python type hints, check out our article.

Detecting Errors Earlier

Although ABCs and abstract methods enforce an interface, one limitation is that errors can slip through into production.

What if we can detect interface contract breaches earlier?

Enter typing.Protocol! This feature enables static type checking, ensuring not only that our class implements the enforced interface, but also that it does so with the right method signatures.

Here’s how we could refactor Pybites Search to use this (code):

from typing import Protocol


class PybitesSearchProtocol(Protocol):
    def match_content(self, search: str) -> list[str]:
        """Implement in subclass to search Pybites content"""
        ...


class CompleteSearch:
    def match_content(self, search: str) -> list[str]:
        # Implementation of search method
        return ["result1", "result2"]


class IncompleteSearch:
    # Notice that we don't implement match_content here
    pass


def perform_search(search_tool: PybitesSearchProtocol, query: str) -> None:
    results = search_tool.match_content(query)
    print(results)


# Static type checking will pass for CompleteSearch
perform_search(CompleteSearch(), "Python")

# Static type checking will fail for IncompleteSearch
perform_search(IncompleteSearch(), "Python")

In this refactored version, PybitesSearchProtocol defines an interface using typing.Protocol.

Unlike with ABCs, classes like CompleteSearch and IncompleteSearch don’t need to inherit from PybitesSearchProtocol to be considered compliant.

Instead, compliance is determined by whether a class implements the required methods, in this case, match_content.

When the perform_search function is called with an instance of CompleteSearch, static type checkers like mypy will confirm that CompleteSearch satisfies the PybitesSearchProtocol because it implements match_content.

However, passing an instance of IncompleteSearch, which lacks the match_content method, will result in a type-checking error.

This approach, using typing.Protocol, offers a more flexible way of enforcing interfaces, particularly useful in scenarios where rigid class hierarchies are undesirable or unnecessary.

It aligns well with Python’s dynamic nature while still leveraging the benefits of static type checking to ensure code correctness.

Static Type Checking

Now let’s see how this protocol is enforced as I try to implement it.

Step 1: Missing Method Implementation

Initially, a class like IncompleteSearch does not implement the required match_content method at all:

class IncompleteSearch:
    pass

Failing to implement a required method could lead to AttributeError exceptions at runtime, disrupting user experience and potentially halting application functionality.

Running mypy, it catches this error:

$ mypy script.py
script.py:24: error: Argument 1 to "perform_search" has incompatible type "IncompleteSearch"; expected "PybitesSearchProtocol" [arg-type]
Found 1 error in 1 file (checked 1 source file)

Step 2: Incorrect Method Signature

Next we implement the method but with an incorrect signature:

class IncompleteSearch:
    def match_content(self):
        pass

Implementing a method with the wrong signature may result in a TypeError that is only caught when the specific code path is executed, risking data inconsistencies or application crashes in production environments.

Mypy catches this too:

$ mypy script.py
script.py:25: error: Argument 1 to "perform_search" has incompatible type "IncompleteSearch"; expected "PybitesSearchProtocol" [arg-type]
script.py:25: note: Following member(s) of "IncompleteSearch" have conflicts:
script.py:25: note:     Expected:
script.py:25: note:         def match_content(self, search: str) -> list[str]
script.py:25: note:     Got:
script.py:25: note:         def match_content(self) -> Any
Found 1 error in 1 file (checked 1 source file)

Step 3: Incompatible Return Type

Finally, we correct the method signature but return an incorrect type:

class IncompleteSearch:
    def match_content(self, search: str) -> list[str]:
        return (1, 2)

Returning incorrect types from methods can lead to subtle bugs, such as incorrect data processing or application logic failures, which may not be immediately apparent and could lead to significant issues over time.

Once again, mypy flags it:

$ mypy script.py
script.py:15: error: Incompatible return value type (got "tuple[int, int]", expected "list[str]") [return-value]
Found 1 error in 1 file (checked 1 source file)

Incorporating typing.Protocol into our Python codebase not only facilitates earlier error detection and circumvents the inheritance requirement but also subtly enforces the Liskov Substitution Principle (LSP).

By ensuring that objects are replaceable with instances of their subtypes without altering the correctness of the program, typing.Protocol aids in creating more reliable and maintainable code.

This alignment with SOLID principles highlights the broader impact of adopting modern type checking in Python, enhancing both code quality and developer productivity.

Enhancing Protocols with Runtime Checkability

To bridge the gap between static type checking and runtime flexibility, Python’s typing module includes the @typing.runtime_checkable decorator.

This feature allows protocols to be used in runtime type checks, similar to abstract base classes.

Applying @typing.runtime_checkable to a protocol makes it possible to use isinstance() and issubclass() checks, offering a layer of dynamic validation that complements static type checking.

However, it’s important to note the limitation: while runtime checkable protocols can verify the presence of required methods at runtime, they do not validate method signatures or return types as static type checking does.

This offers a practical yet limited approach to enforcing interface contracts dynamically, providing developers with additional tools to ensure their code’s correctness and robustness.

One practical use case for @typing.runtime_checkable is in developing plugins or extensions. When you’re designing an API where third-party developers can provide plugin implementations, runtime checks can be used to verify that an object passed to your API at runtime correctly implements the expected interface.

This can be especially useful in dynamically loaded modules or plugins where static type checks might not catch mismatches. For example, before invoking plugin-specific methods, you might use isinstance(plugin, PluginProtocol) to ensure that the plugin adheres to your defined protocol, enhancing reliability and reducing the risk of runtime errors.
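A minimal sketch of that plugin scenario (PluginProtocol and the plugin classes are invented names for illustration) shows both the usefulness and the limitation of runtime checkable protocols:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class PluginProtocol(Protocol):
    def match_content(self, search: str) -> list[str]:
        ...


class GoodPlugin:
    def match_content(self, search: str) -> list[str]:
        return [f"match for {search!r}"]


class BadPlugin:
    # No match_content at all
    pass


class WrongSigPlugin:
    # Has the method name, but the wrong signature
    def match_content(self):
        return []


# Runtime validation before invoking plugin-specific methods:
print(isinstance(GoodPlugin(), PluginProtocol))      # True
print(isinstance(BadPlugin(), PluginProtocol))       # False
print(isinstance(WrongSigPlugin(), PluginProtocol))  # True -- only presence is checked!
```

Note the last line: the runtime check happily accepts WrongSigPlugin, because isinstance() only verifies that a match_content attribute exists. Signature and return type mismatches are only caught by a static checker like mypy.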

Structural Subtyping and Embracing Python’s Duck Typing

The concept of structural subtyping, formalized through typing.Protocol, is a testament to Python’s commitment to flexibility and the “duck typing” philosophy.

Structural subtyping allows a class to be considered a subtype of another if it meets certain criteria, specifically if it has all the required methods and properties, irrespective of the inheritance relationship.

This approach enables developers to design more generic, reusable components that adhere to specified interfaces without being bound by a strict class hierarchy.

It essentially allows objects to be used based on their capabilities rather than their specific types, echoing the Pythonic saying, “If it looks like a duck and quacks like a duck, it’s a duck.”
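As a toy sketch of that saying (all names here are invented for illustration), any object with a compatible quack method satisfies the protocol, regardless of its class hierarchy:

```python
from typing import Protocol


class Quacker(Protocol):
    def quack(self) -> str:
        ...


class Duck:
    def quack(self) -> str:
        return "Quack!"


class RobotDuck:
    # No common base class with Duck -- only the capability counts.
    def quack(self) -> str:
        return "beep-quack"


def make_noise(thing: Quacker) -> str:
    return thing.quack()


print(make_noise(Duck()))       # Quack!
print(make_noise(RobotDuck()))  # beep-quack
```

Both calls pass static type checking, because both classes structurally match Quacker.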

By leveraging typing.Protocol, Python developers can enjoy the benefits of static type checking while maintaining the language’s dynamic, expressive nature, ensuring code is both flexible and type-safe.

Further reading
  • Python typing Module Documentation: Dive into the official Python documentation for an in-depth look at type hints. This guide covers everything from basic annotations to advanced features like typing.Protocol, equipping you with the knowledge to write clearer and more maintainable Python code.
  • PEP 544 – Protocols: Structural subtyping (static duck typing): Explore the proposal that introduced typing.Protocol to Python. This document provides valuable context on the motivation behind protocols, detailed examples, and insights into how they enhance Python’s type system, making it an essential read for developers interested in type checking and Python’s design philosophy.
  • Building Implicit Interfaces in Python with Protocol Classes: This article offers a practical approach to using typing.Protocol for defining and implementing interfaces in Python. It’s perfect for readers looking for actionable advice and examples on how to leverage protocols in their projects.
  • Robust Python: Patrick Viafore’s book is a treasure trove of information on Python’s typing system, including a dedicated chapter on typing.Protocol. It’s a great resource for those seeking to deepen their understanding of Python’s type hints and how to use them effectively to write robust, error-resistant code.

Adopting typing.Protocol in the PyBites Search feature showcased not just an alternative to ABCs and abstract methods, but also illuminated a path toward more proactive error management and a deeper alignment with Python’s dynamic and flexible design ethos.

By embracing typing.Protocol, we not only enhance our code’s reliability through early error detection but also open the door to more robust design patterns that leverage Python’s strengths in readability and expressiveness.

The exploration of runtime checkability with @typing.runtime_checkable and the discussion around structural subtyping and duck typing further underscore the versatile and powerful nature of Python’s type system.

These features collectively foster a development environment where code is not just correct, but also elegantly aligned with the principles of modern software design.

As Python continues to evolve, tools like typing.Protocol are invaluable for developers aiming to write high-quality, maintainable code that adheres to both the letter and spirit of Python’s design philosophy.

Categories: FLOSS Project Planets

TechBeamers Python: 20 Practical Pandas Tips and Tricks for Python

Planet Python - Fri, 2024-02-09 10:51

Welcome to this Python tutorial including Pandas tips and tricks! In this guide, we’ll share 20 practical techniques to make your data tasks easier and improve your Python data analysis. Whether you’re new or experienced, these tips will help you become more efficient in using Pandas for data manipulation. Let’s dive in and explore the […]

The post 20 Practical Pandas Tips and Tricks for Python appeared first on TechBeamers.

Categories: FLOSS Project Planets

Thorsten Alteholz: My Debian Activities in January 2024

Planet Debian - Fri, 2024-02-09 10:23
FTP master

This month I accepted 333 and rejected 31 packages. The overall number of packages that got accepted was 342.

Hooray, I already accepted package number 30000.

The statistic, where I get my numbers from, started in February 2002. Up to now 81694 packages got accepted. Given that I accepted package 20000 in October 2020, would I be able to accept half of the packages that made it through NEW?

Debian LTS

This was my hundred-fifteenth month that I did some work for the Debian LTS initiative, started by Raphael Hertzog at Freexian.

During my allocated time I uploaded:

  • [DLA 3726-1] bind9 security update for one CVE to fix stack exhaustion
  • [#1060186] Bookworm PU-bug for libde265; yes, this is a new one.
  • [#1056935] Bullseye PU-bug for libde265; yes, this is a new one as well

This month I was finally able to really run the test suite of bind9. I had already wanted to give up on this package, but Santiago encouraged me to proceed. So, here you are: a fixed Buster version. Jessie and Stretch have to wait a bit until the dust has settled.

Last but not least I also did a few days of frontdesk duties.

Debian ELTS

This month was the sixty-sixth ELTS month. During my allocated time I uploaded:

  • [ELA-1031-1] xerces-c security update for one CVE to fix an out-of-bound access in Jessie and Stretch
  • [ELA-1036-1] jasper security update for one CVE to fix an invalid memory write

This month I also worked on the Jessie and Stretch updates for bind9. The uploads should happen soon. I also started to work on an update for exim4. Last but not least I did a few days of frontdesk duties.

Debian Printing

This month I adopted:

At the moment these packages are the last adoptions to preserve the old printing protocol within Debian. If you know of other packages that should be retained, please don’t hesitate to ask me. But don’t wait too long, I have fun processing RM-bugs :-).

This work is generously funded by Freexian!

Debian Astro

This month I uploaded a new upstream version of:

Debian IoT

This month I uploaded new upstream versions of:

  • pyicloud to remove the deprecated dependency python3-future
Other stuff

This month I uploaded new upstream version of packages, did a source upload for the transition or uploaded it to fix one or the other issue:

Categories: FLOSS Project Planets

Gábor Hojtsy: Upgraded my blog from Drupal 7 to Drupal 10 in less than 24 hours with the open source Acquia Migrate Accelerate

Planet Drupal - Fri, 2024-02-09 08:57
Upgraded my blog from Drupal 7 to Drupal 10 in less than 24 hours with the open source Acquia Migrate Accelerate

After 16 years, I was back in Brussels for another FOSDEM last weekend. Back then, as the lead maintainer, I presented about the brand new Drupal 6 version. Now I revisited the pieces that made the Drupal 8 Multilingual Initiative super successful (dedicated post about that coming soon). That reminded me how much I used this website and other custom websites to support initiatives I worked on, and how I neglected to take care of the site in recent years. I wanted to move a bit forward, and kind of got carried away in the best way possible.

Gábor Hojtsy Fri, 02/09/2024 - 15:57
Categories: FLOSS Project Planets

Web Review, Week 2024-06

Planet KDE - Fri, 2024-02-09 07:12

Let’s go for my web review for the week 2024-06.

We’ve been waiting 20 years for this - The History of the Web

Tags: tech, web, blog, culture

Excellent piece about the resurgence of old trends on the web.


The European regulators listened to the Open Source communities! - Voices of Open Source

Tags: tech, foss, law

Looks like good progress has been made. A few more adjustments would be welcome before it gets ratified.


Google workers complain bosses are ‘inept’ and ‘glassy-eyed’

Tags: tech, google, transparency, management

Things don’t look great in this giant… it’s astonishing how much eroding vision and transparency can hurt an organization.


Netflix: Piracy is Difficult to Compete Against and Growing Rapidly * TorrentFreak

Tags: tech, streaming, video, business

Unsurprising trend… this was a market where there was no chance of a single dominant platform due to the existing studio behemoths. So now we’re in the streaming wars, people can’t afford to pay for every silo… and they turn to piracy again. It could all have been avoided if we had gone the “global license” route instead of dumping money on platforms.


Stract search engine

Tags: tech, search, web

A new search engine trying to grow. The approach seems interesting, we’ll see where it goes.


Browsers Are Weird Right Now – Tyler Sticka

Tags: tech, web, browser

A bit opinionated and sarcastic maybe… still it feels right, the web browsers landscape is in a sad state.


Over the Edge: The Use of Design Tactics to Undermine Browser Choice

Tags: tech, microsoft, browser, vendor-lockin

Not unexpected of course, but at least it makes it clear that Microsoft is actively trying to prevent users from using the browser they want.


Microsoft is seeking a software architect to port Microsoft 365 to Rust | TechSpot

Tags: tech, microsoft, rust

If they really commit to it, this means Microsoft will really invest big in Rust. Let’s wait and see…


Blog - How I Also Hacked my Car

Tags: tech, automotive, security

The infotainment systems on cars are not as locked down as one might think. Another proof of that.


Rust Won’t Save Us: An Analysis of 2023’s Known Exploited Vulnerabilities – Horizon3.ai

Tags: tech, security, memory, rust

Indeed, not all security issues are due to memory-related problems; those account for 20% of the security issues. This is of course massive, but the other 80% of the security issues come from wrong authentication, appliances and so on.


OPML is underrated

Tags: tech, blog, rss, opml

Definitely true. This is a good way to share your feeds with friends or to setup a blogroll.



Tags: tech, zsh, shell

Indeed, use zsh more. It’s good… or at least better.


VirtualBox KVM public release — Cyberus Technology

Tags: tech, linux, virtualization

This is an interesting VirtualBox fork for Linux. Using KVM as backend should bring interesting benefits.


When “letting it crash” is not enough

Tags: tech, erlang, safety, crash, resilience

It’s here to sell something. That said it does a good job explaining the Erlang model and its “let it crash” approach to failures. Also highlights the obvious limitations.


Cool Things You Can Do with SELECT - Edward Loveall

Tags: tech, sql, databases

Indeed, a very underappreciated keyword in SQL. It can do much more than what it’s often used for.


Postgres is Enough · GitHub

Tags: tech, databases, postgresql

Nice index pointing to resources to do many things with Postgres.



Tags: tech, web, 3d, webcomponents

Looks like a really convenient web component to display 3D models on web pages.


Method of Differences

Tags: tech, mathematics, history

Interesting explanation of the method of differences to easily compute polynomials.


How to hire low experience, high potential people

Tags: hr, interviews

Good ideas and questions to interview candidates. I don’t think I would use everything though. Also I think some of what’s proposed here would work for candidates at any level of experience.


Bye for now!

Categories: FLOSS Project Planets

Real Python: The Real Python Podcast – Episode #191: Focusing on Data Science &amp; Less on Engineering and Dependencies

Planet Python - Fri, 2024-02-09 07:00

How do you manage the dependencies of a large-scale data science project? How do you migrate that project from a laptop to cloud infrastructure or utilize GPUs and multiple instances in parallel? This week on the show, Savin Goyal returns to discuss the updates to the open-source framework Metaflow.


Categories: FLOSS Project Planets

A comparative view of AI definitions as we move toward standardization

Open Source Initiative - Fri, 2024-02-09 05:54

Discussions of Artificial Intelligence (AI) regulation will be heating up in 2024 with a provisional agreement for the EU AI Act having been reached in December 2023. The evolution of the EU AI Act is progressing toward a technology-neutral definition for AI to be applied to future AI systems. In the coming months, multiple states will agree on precise legal definitions, which reflect moral considerations of the role that AI will and will not be allowed to play in Europe for the very first time. And formally defining AI is an ongoing debate. 

Precise definitions within a rapidly expanding field are perhaps not the first things that come to mind when asked about pressing issues concerning AI. However, as its influence grows, arriving at one seems essential when considering how to regulate it. Agreeing on what AI is–and what it is not–on a transnational level, is proving to be increasingly important. Online spaces rarely respect sovereignty, and the role of AI in public life is expected to increase rapidly. 

Different countries and organizations have different definitions, though the AI Act is expected to provide some standardization, not only within the EU but also outside of it due to its influence. Other than providing a framework for businesses to operate within in the future, it further shows the anticipation of what, how and where AI will act and what it will develop towards. Let’s consider how different organizations and states currently are defining AI systems.


So far, the AI ACT’s definition of AI systems is expected to follow the OECD’s current definition. This currently seems to be the most influential definition and it reads as follows: 

An AI system is a machine-based system that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments. Different AI systems vary in their levels of autonomy and adaptiveness after deployment.

Notably, the OECD’s definition has undergone changes from its first draft to the current one above. The removal of “human-based inputs” and the addition of “decisions” when referring to outputs reflects a potential for vastly limiting human-centred decisions and actions. While acknowledging that different systems vary in their autonomy, this change opens up the potential for full autonomy. This can be controversial, to say the least, and can be expected  to feed into the growing concerns of AI alignment. As we await the EU AI Act, if they indeed adopt the same or even a similar definition, it will be interesting to see their definition of personhood, considering the removal of “human-based” under inputs. 


The International Organization for Standardization has defined AI systems as follows:


<engineered system> set of methods or automated entities that together build, optimize and apply a model (3.1.26) so that the system can, for a given set of predefined tasks (3.1.37), compute predictions (3.2.12), recommendations, or decisions

Note 1 to entry: AI systems are designed to operate with varying levels of automation (3.1.7).

Note 2 to entry: Predictions (3.2.12) can refer to various kinds of data analysis or production (including translating text, creating synthetic images or diagnosing a previous power failure). It does not imply anteriority.

<discipline> study of theories, mechanisms, developments and applications related to artificial intelligence <engineered system> (3.1.2)

AI System:

engineered system featuring AI <engineered system> (3.1.2)

Note 1 to entry: AI systems can be designed to generate outputs such as predictions (3.2.12), recommendations and classifications for a given set of human-defined objectives.

Note 2 to entry: AI systems can be designed to operate with varying levels of automation.

Here, there is a consideration of what kind of system is considered, notably an engineered one. This is interesting as previous definitions have been somewhat ambiguous about what technologies, in fact, will fall under such legislation. There is also a focus on the cooperation of different entities, not specified of human or otherwise. Notably, they do not mention the origin and what kind of input is being processed, though through “varying levels of automation” it can be inferred that it covers the balance between human or non-human inputs, thus offering varying levels of autonomy. 

South Korea

South Korea also adopted their definition of AI system in their 2023 AI Act, and it reads as follows:

Article 2 (Definitions) As used in this Act, the following terms have the following meanings.

  1. “Artificial intelligence” refers to the electronic implementation of human intellectual abilities such as learning, reasoning, perception, judgment, and language comprehension.

  2. “Artificial intelligence technology” means hardware technology required to implement artificial intelligence, software technology that systematically supports it, or technology for utilizing it.

While not mentioning AI systems, they attribute human attributes, like perception, to an electronic entity. While not mentioning “decisions,” attributing human characteristics perhaps makes that point redundant, as it can be interpreted as an actor, acting on a similar level as humans. Further, they are expansive on what technology is considered AI, as even a cable providing power can, under their current definition, be classified as a piece of AI technology. 

US Executive Order

In the last part of 2023, The Biden administration issued an executive order whereby they defined an AI system:

“a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to perceive real and virtual environments; abstract such perceptions into models through analysis in an automated manner; and use model inference to formulate options for information or action.”

Here, The Biden Administration merges human and machine-based inputs, highlighting the cooperation between the two actors. And while not legally binding, it shows intent. It shows more caution and perhaps skepticism regarding AI acting autonomously, as compared to any other of the major actors. Interestingly, the distinction between virtual and “real” (assuming this means physical, though the wording of it remains problematic) environments shows a similar skepticism to the scope and spheres that the Biden Administration is interested in AI occupying. This limits the controversial issue of potential autonomy present in previous definitions, though it limits communication between systems independently of human inputs, which can prove problematic in practice. 

Answers we are excited to see

As we enter into an important legislative year for AI, we are looking forward to getting answers to the following questions regarding the legal definitions of AI systems:

  • What definition of personhood will accompany the AI systems definition in the AI Act? And what does this mean for the intellectual protection of something entirely made by an AI, considering that it allows for large amounts of autonomy? That is, if it indeed follows the same definition as the OECD. 
  • What kind of technology will be considered to be AI? Will it range from Excel spreadsheets to LLMs? Are we considering “machine-based systems,” an “engineered system” or something else?
  • Will legislation be strong enough, or perhaps broad enough, to encompass the massive changes AI is currently undergoing? And what predictions can we infer that the EU is making on behalf of the future advancements of AI?

The post <span class='p-name'>A comparative view of AI definitions as we move toward standardization</span> appeared first on Voices of Open Source.

Categories: FLOSS Research

Abhijith PA: A new(kind of) phone

Planet Debian - Fri, 2024-02-09 05:53

I was using a refurbished Xiaomi Redmi 6 Pro (codename: sakura). I remember buying this phone to run LineageOS, as I found this model had support. But by the time I bought it (there was quite a gap between my research and the actual purchase), LineageOS had ended support for this device. The maintainer might’ve ended this port; I don’t blame them. It’s non-rewarding work.

Later I found there was a DotOS custom ROM available. I did a week of test runs. There seemed to be a lot of bugs, but they were all minor and I could live with that. Since there was no other official port for the Redmi 6 Pro at the time, I settled on this buggy DotOS nightly build.

The phone was doing fine all these (3+) years, but recently the camera started showing greyish dots in some areas, to the point that I can’t even take a photo properly. The outer body of the phone wasn’t original, as it was refurbished, so the colors started to peel off and it wasn’t aesthetically pleasing. Then there was the location detection issue: the phone took a while to get a fix, and the battery drained fast. I recently started mapping public transportation routes, and the GPS issue was a problem for me. With all these problems combined, I decided to buy a new phone.

But choosing a phone wasn’t easy. There are far too many options in the market. However, one thing I was quite sure of was that I wouldn’t be buying a brand new phone, but a refurbished one. It is my protest against all these phone manufacturers who have convinced the general public that a mobile phone’s lifespan is only 2 years. Of course, there are a few exceptions, but the majority of players in the Indian market are not.

Now I think about it, I haven’t bought any brand new computing device, be it phones or laptops since 2015. All are refurbished devices except for an Orange pi. My Samsung R439 laptop which I already blogged about is still going strong.

Back to picking a phone. I began by comparing the LineageOS website against an online refurbished store to pick a phone with official lineage support, then searched reddit for any reported hardware failures.

Google Pixel phones are well known in the custom ROM community for ease of installation, good hardware, and timely Android security releases. These claims come from privacyguides.org and XDA Developers, and I was convinced to go with that choice. But I have someone at home who is at the age of throwing things, so I didn't want to risk buying an expensive Pixel phone only to have it end up with a broken screen and edges. Perhaps I will consider buying a Pixel next time. After doing some more research and cross-comparison, I landed on the Redmi Note 9. Codename: merlinx.

Based on my previous experience, I knew it was going to be a pain to unlock the bootloader of a Xiaomi phone, and I was prepared for that, but this time there was an extra hoop. The phone came with MIUI version 13. A quick search on the XDA forums and Reddit told me that unless I downgraded to MIUI 12 or so, it would be difficult to unlock the bootloader on this device. And performing the downgrade wasn't exactly easy, as it required tools like SP Flash.

It took some time, but I completed the downgrade process. From there, the rest was a cakewalk, as everything was perfectly documented in the LineageOS wiki. And ta-da, I have a phone running LineageOS. I've been using it for some time now, and honestly I haven't come across any bugs in my usage.

One piece of advice if you are going to flash a custom ROM: there are plenty of videos on YouTube about unlocking bootloaders and installing ROMs. Please REFRAIN from watching them and experimenting on your phone unless you have first figured out a way to un-brick your device.

Categories: FLOSS Project Planets

Reproducible Builds (diffoscope): diffoscope 256 released

Planet Debian - Thu, 2024-02-08 19:00

The diffoscope maintainers are pleased to announce the release of diffoscope version 256. This version includes the following changes:

* Use a deterministic name when extracting content from GPG artifacts instead of trusting the value of gpg's --use-embedded-filenames. This prevents a potential information disclosure vulnerability that could have been exploited by providing a specially-crafted GPG file with an embedded filename of, say, "../../.ssh/id_rsa". Many thanks to Daniel Kahn Gillmor <dkg@debian.org> for reporting this issue and providing feedback. (Closes: reproducible-builds/diffoscope#361)
* Temporarily fix support for Python 3.11.8 re. a potential regression with the handling of ZIP files. (See reproducible-builds/diffoscope#362)
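The embedded-filename issue above is a classic path traversal pattern. A minimal sketch (hypothetical helper names, not diffoscope's actual code) shows why trusting an attacker-supplied name is dangerous and how an extractor-chosen deterministic name avoids the problem:

```python
import os.path

def embedded_dest(output_dir, embedded_name):
    # Trusting the embedded filename lets ".." components escape output_dir
    return os.path.normpath(os.path.join(output_dir, embedded_name))

def deterministic_dest(output_dir, index):
    # A name chosen by the extractor itself cannot traverse anywhere
    return os.path.join(output_dir, f"gpg-content-{index}")

# A crafted embedded name escapes the extraction directory entirely:
print(embedded_dest("/tmp/out", "../../.ssh/id_rsa"))   # /.ssh/id_rsa
print(deterministic_dest("/tmp/out", 0))                # /tmp/out/gpg-content-0
```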

You can find out more by visiting the project homepage.

Categories: FLOSS Project Planets

Balint Pekker: To Patch or Not To Patch

Planet Drupal - Thu, 2024-02-08 18:35
There is an ongoing debate among developers regarding the use of patches versus pull requests for contributions. Since its migration to GitLab in 2018, Drupal has undergone significant changes. As of July 2024, the removal of Drupal CI and automated patch testing could potentially change the way contributions are made.
Categories: FLOSS Project Planets

Dirk Eddelbuettel: RcppArmadillo on CRAN: New Upstream, Interface Polish

Planet Debian - Thu, 2024-02-08 18:33

Armadillo is a powerful and expressive C++ template library for linear algebra and scientific computing. It aims towards a good balance between speed and ease of use, has a syntax deliberately close to Matlab, and is useful for algorithm development directly in C++, or quick conversion of research code into production environments. RcppArmadillo integrates this library with the R environment and language–and is widely used by (currently) 1119 other packages on CRAN, downloaded 32.5 million times (per the partial logs from the cloud mirrors of CRAN), and the CSDA paper (preprint / vignette) by Conrad and myself has been cited 575 times according to Google Scholar.

This release brings a new (stable) upstream (minor) release Armadillo 12.8.0 prepared by Conrad two days ago. We, as usual, prepared a release candidate which we tested against the over 1100 CRAN packages using RcppArmadillo. This found no issues, which was confirmed by CRAN once we uploaded and so it arrived as a new release today in a fully automated fashion.

We also made a small change that had been prepared by GitHub issue #400: a few internal header files that were cluttering the top-level of the include directory have been moved to internal directories. The standard header is of course unaffected, and the set of ‘full / light / lighter / lightest’ headers (matching we did a while back in Rcpp) also continue to work as one expects. This change was also tested in a full reverse-dependency check in January but had not been released to CRAN yet.

The set of changes since the last CRAN release follows.

Changes in RcppArmadillo version (2024-02-06)
  • Upgraded to Armadillo release 12.8.0 (Cortisol Injector)

    • Faster detection of symmetric expressions by pinv() and rank()

    • Expanded shift() to handle sparse matrices

    • Expanded conv_to for more flexible conversions between sparse and dense matrices

    • Added cbrt()

    • More compact representation of integers when saving matrices in CSV format

  • Five non-user facing top-level include files have been removed (#432 closing #400 and building on #395 and #396)

Courtesy of my CRANberries, there is a diffstat report relative to previous release. More detailed information is on the RcppArmadillo page. Questions, comments etc should go to the rcpp-devel mailing list off the Rcpp R-Forge page.

If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

15-Minute Bug Initiative update

Planet KDE - Thu, 2024-02-08 15:50

A tad over two years ago, I revealed the 15-Minute Bug Initiative–an attempt to improve the out-of-the-box user experience for Plasma by focusing on fixing obvious papercut issues. The idea was to crush the meme of “KDE is buggy” the same way we systematically addressed similar complaints like “KDE is ugly” and “KDE is bloated” in years past.

Well, it’s been two years, so how did it go? Let’s talk about it! First some numbers, because we like numbers:

  • Starting number of bugs: 92
  • Current number of bugs: 32
  • Total number of bugs fixed (because more were added over time): 231
  • Percent of all total 15-minute bugs that have been fixed: 87.8%

(note if you’re from the future: if you visit those links, some numbers may be different–hopefully lower for the second one and higher for the third one!)
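For the curious, the percentage follows directly from the counts above (a quick back-of-the-envelope check):

```python
fixed, still_open = 231, 32
total_ever = fixed + still_open  # every bug that has carried the 15-minute tag

pct_fixed = fixed / total_ever * 100
print(f"{pct_fixed:.1f}% of all 15-minute bugs fixed")  # 87.8% ...
```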

Wow! That’s quite a few. So this initiative looks like it’s been a real success so far! Nevertheless, 32 bug reports remain open, so we can’t declare victory yet. These are some of the stubbornest, hardest-to-fix bugs that require major re-architecting, working upstream, or similarly challenging efforts. Hopefully the pace of improvement seen over these years has managed to convince you that they’ll eventually be resolved as well.

“Wait a minute, Plasma is still buggy AF you n00b”

Keep in mind these aren’t all the bug reports in the world we can fix (there are over 5000 of them for Plasma and Plasma-aligned software alone), just the ones I and some others have deemed to be most obvious to the average user! If you’re not an average user because you have three monitors arranged in a weird shape, each plugged into a different GPU from a different vendor, each with its own DPI, scale factor, and custom Plasma panels, plus 4 activities and 9 virtual desktops and 6 internal disks, only half of which automount, and 12 apps set to autostart, each installed onto a different disk, 15 window rules, and finally a custom theming setup including Kvantum themes and Aurorae window decorations… then yeah, you’re probably going to experience some more bugs compared to a more typical user who doesn’t have such a complex setup!

That’s okay. We care about you too, and we do work on those kinds of more esoteric bugs because many of us also have complex setups! But progress here will be necessarily slower, because the complex setups are more varied, more unusual, and harder to debug. And let’s be honest here: those of us with setups like these are experts capable of working around most bugs we find, who really should be helping to investigate and fix them. Admit it, you know it’s true!

But in a way, this is good: it represents Plasma moving from “it’s generally buggy” to “it’s specifically buggy”–buggy only with certain less common combinations of settings, rather than buggy in a way that anyone can find within 15 minutes of using a system with its default settings. Improving on that was the point of this initiative.

Next steps

Up until now we’ve been ignoring Wayland-only bugs here, because the Wayland session was not the default one. Well, with Plasma 6 that changes, so all of those Wayland-only issues that meet the loose criteria to be a 15-minute bug will be promoted. Similarly, current 15-minute bugs that are X11-only will be demoted. So sometime in the next few weeks, expect the list to shift around a bit.

Once the number gets down to 0, it will of course go up again periodically as new bugs are found or introduced. But this is great! It means we can whack them as they appear, rather than letting them pile up over time. In this way the list becomes a “rapid response needed” task list rather than a backlog we’re always behind on.

What happens with those development resources once the 15-minute Plasma bugs are under control? I have a plan, first articulated in the original announcement 2 years ago: extend the program to Frameworks bugs! There are quite a few candidates there too. And because Frameworks bugs can affect Plasma and our vast app library, quality will go up overall.

Speaking of those apps, once the 15-minute Frameworks bugs are also down to zero or close to it, we can include app bugs. This will finally give us complete coverage! I have a dream that one day, we’ll have a stable and mostly bug-free UI layer and our limited development resources can be focused more on performance work, sustainably-crafted new features, usability, more standardized styling, etc. I think it’s doubtful we can get there while we’re still battling routine bugs all the time.

How you can help

As always, help work on the existing 15-minute bugs if you can! If not, it’s always useful to work on bug triaging, so that more of those issues that will eventually become 15-minute bugs can get discovered. Another impactful way is to donate to KDE e.V., the nonprofit that supports KDE on a ridiculously small budget. We’re still running the Plasma 6 fundraiser, which represents a great way to donate!

Categories: FLOSS Project Planets

Drupal Association blog: The Top 6 Benefits of Attending DrupalCon Portland 2024

Planet Drupal - Thu, 2024-02-08 14:51

DrupalCon Portland 2024 promises to be an unmissable event for web developers, designers, and business professionals invested in the Drupal ecosystem. As one of the most significant gatherings in the Drupal community, the conference offers a variety of benefits that extend beyond just technical knowledge. From networking opportunities to staying ahead in the rapidly evolving digital landscape, attending DrupalCon Portland can be a game-changer for professionals. Let's explore 6 of the top benefits of being part of this transformative event!

Cutting-Edge Insights

DrupalCon is renowned for bringing together thought leaders and experts in the Drupal community. Attendees will have the chance to gain insights into the latest trends, innovations, and best practices in web development, ensuring they stay at the forefront of the industry.

Networking and Career Opportunities

Connect with like-minded professionals and Drupal enthusiasts at DrupalCon. The conference offers a networking platform beyond sessions, fostering relationships for potential partnerships, job opportunities, and collaborative projects. Explore career prospects by engaging with hiring companies and recruiters in the Drupal ecosystem, as many organizations actively seek skilled professionals to join their teams.

Skill Enhancement

Join interactive training sessions and learn from industry experts at DrupalCon. These sessions aim to improve your skills and offer practical knowledge for immediate application to your projects. DrupalCon provides a variety of session tracks suitable for different interests and skill levels. Whether you're a beginner or an experienced developer, there are sessions customized to meet your requirements, guaranteeing a comprehensive learning experience.

Stay Informed About Drupal Releases

DrupalCon Portland 2024 presents a fantastic chance to stay informed about the latest developments and insights in Drupal's upcoming core and feature releases. By participating, you can stay ahead of the curve by gaining knowledge about the new features, enhancements, and possible challenges that Drupal has in store. This event provides a valuable opportunity to stay current and well-informed within the Drupal community.

Contribute to the Community

DrupalCon provides a comprehensive experience beyond training and discussions, allowing participants to actively contribute through collaborative code sprints. These hands-on sessions, where developers collectively enhance Drupal core and modules, offer valuable practical experience. Beyond coding, attendees can also contribute through non-code avenues such as volunteering and mentoring, enriching their overall DrupalCon experience. This inclusive and collaborative spirit defines DrupalCon, fostering a vibrant community of shared knowledge and expertise.

Vendor Expo

Take advantage of the vendor expo at DrupalCon Portland 2024 to explore cutting-edge tools, services, and technologies that seamlessly integrate with Drupal. Immerse yourself in engaging conversations with industry-leading vendors to gain a deep understanding of their offerings. This is a unique opportunity to identify potential partnerships or solutions that can significantly enhance the effectiveness of your Drupal projects. By actively participating in the expo, you'll be able to discover and leverage the latest innovations that can elevate your Drupal experience.

DrupalCon Portland 2024 is not just a conference; it's an immersive experience that can significantly impact your professional growth. The benefits are extensive, from staying updated on the latest Drupal developments to forging valuable connections and contributing to the open source community. Attendees can expect to leave the conference with enhanced skills, fresh perspectives, and a network of contacts that can propel their careers to new heights.

Registration is now open for DrupalCon Portland 2024, held from 6-9 May at the Oregon Convention Center.

Categories: FLOSS Project Planets

Python People: Nikita Karamov - Django project maintainer, from Russia to Germany

Planet Python - Thu, 2024-02-08 14:08

Nikita Karamov is a Python developer and maintainer on various open source Python projects.

Some topics covered:

  • Notes on university education in programming and engineering vs theory
  • Jazzband for maintaining Django projects
  • Contributing to open source makes you a better programmer
  • Moving from Russia to Germany during college
  • Cultural differences between Russia, Germany, and Oregon
  • The nice lack of drama in the Python community
  • A lack of universities teaching Python for web development

Links from the show

  • Jazzband
  • django-simple-menu

The Complete pytest Course

  • Level up your testing skills and save time during coding and maintenance.
  • Check out courses.pythontest.com

★ Support this podcast on Patreon ★
Categories: FLOSS Project Planets

lightning @ Savannah: GNU lightning 2.2.3 released!

GNU Planet! - Thu, 2024-02-08 13:51

GNU lightning is a library to aid in making portable programs
that compile assembly code at run time.


Download release:

  GNU Lightning 2.2.3 main new features:

  • PowerPC port now optimizes for a variable stack frame size and only creates a stack frame if the function is not a leaf function.
  • New callee test to ensure register values saved on the stack are not corrupted when calling a jit or C function. While no problem was found in any port, the new test was added to make sure there were no failures.
  • Add back the jit_hmul interface, from Lightning 1.x. There are special cases where it is desirable to only know the high part of a multiplication.
  • Corrected a wrong implementation of zero right shift with two-register output.
  • Add new pre and post increment for load and store instructions.
  • Several minor bug fixes.
Categories: FLOSS Project Planets

PyCharm: PyCharm 2024.1 EAP 4: Sticky Lines, and More

Planet Python - Thu, 2024-02-08 12:44

The Early Access Program for PyCharm 2024.1 continues with our latest build where you can preview new features, including convenient sticky lines in the editor, and more.

You can download the new version from our website, update directly from the IDE or via the free Toolbox App, or use snaps for Ubuntu.

Download PyCharm 2024.1 EAP

Learn about the key highlights below.

User experience

Sticky lines in the editor

To simplify working with large files and exploring new codebases, we’ve introduced sticky lines in the editor. This feature keeps key structural elements, like the beginnings of classes or methods, pinned to the top of the editor as you scroll. This way, scopes always remain in view, and you can promptly navigate through the code by clicking on a pinned line.

As of PyCharm 2024.1, this feature will be enabled by default. You can disable it via a checkbox in Settings/Preferences | Editor | General | Appearance, where you can also set the maximum number of pinned lines.

Version control systems

Option to display review branch changes in a Git Log tab

IntelliJ IDEA 2024.1 EAP 4 streamlines the code review workflow by offering a focused view of branch-related changes. For GitHub, GitLab, and Space, it is now possible to see changes in a certain branch in a separate Log tab within the Git tool window. To do so, click on the branch name in the Pull Requests tool window and pick Show in Git Log from the menu.

These are the most notable improvements in the latest PyCharm 2024.1 EAP build. Explore the full list of changes in the release notes, and stay tuned for more updates next week.

We encourage you to share your opinion about the newest additions in the comments below or by contacting us on X (formerly Twitter). If you spot a bug, please report it via our issue tracker.

Categories: FLOSS Project Planets

TechBeamers Python: Python’s Map() and List Comprehension: Tips and Best Practices

Planet Python - Thu, 2024-02-08 11:20

Welcome to this tutorial where we will explore Python map() and List Comprehension best practices and share some cool coding tips. These techniques, when mastered, can make your code cleaner, more concise, and efficient. In this guide, we’ll explore these concepts from the ground up, using simple and practical examples. Let’s begin mastering the map […]
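As a quick taste of the comparison the tutorial covers, here is a minimal sketch (illustrative, not taken from the tutorial itself) of the same transformation written both ways:

```python
nums = [1, 2, 3, 4]

# map() applies a function lazily; wrap it in list() to materialize the result
squares_map = list(map(lambda n: n * n, nums))

# A list comprehension builds the list directly and often reads more naturally
squares_comp = [n * n for n in nums]

assert squares_map == squares_comp == [1, 4, 9, 16]

# map() shines when an existing function is applied as-is:
print(list(map(int, ["1", "2", "3"])))  # [1, 2, 3]
```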

The post Python’s Map() and List Comprehension: Tips and Best Practices appeared first on TechBeamers.

Categories: FLOSS Project Planets

TechBeamers Python: 20 Practical Python Data Analysis Tips and Tricks

Planet Python - Thu, 2024-02-08 09:00

Hey, welcome! Today, we’re talking about practical Python data analysis tips and tricks. With Pandas and NumPy, we’ll tidy up data, spot trends, and do some smart analysis. It’s all about making your data work easier and getting cool insights. Let’s dive in to optimize your data analysis tasks. Practical Python Tips for Solid Data […]
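As a tiny, dependency-free sketch of the kind of tidy-up step such tips cover (the tutorial itself presumably uses Pandas and NumPy; the data here is made up), dropping missing values before aggregating might look like this:

```python
from statistics import mean

readings = [12.1, None, 11.8, 12.4, None, 12.0]

# Drop missing values first -- a typical tidy-up step before any analysis
clean = [r for r in readings if r is not None]

print(len(clean), "valid readings, mean =", round(mean(clean), 2))
```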

The post 20 Practical Python Data Analysis Tips and Tricks appeared first on TechBeamers.

Categories: FLOSS Project Planets

Ramsalt Lab: Upgrading your site from Drupal 9 to 10

Planet Drupal - Thu, 2024-02-08 08:44
By Perry Aryee, Developer, 08.02.2024

With Drupal 9 having reached its end of life (EOL) on November 1, it’s time to start planning for an upgrade.

For those already operating on Drupal 9, upgrading to Drupal 10 is not as daunting as earlier upgrades and promises to be easy, reflecting the software’s overall trend towards smaller, more incremental upgrades and faster iterations. According to Drupal, you should "completely update your Drupal 9 site to the most recent version of the modules and theme(s), before updating to Drupal 10." 

Code deprecated in Drupal 9 will be removed in Drupal 10, but there are ways to scan your codebase and update the affected code.

The first thing to do is to install the drupal/upgrade_status module in your project and enable it: run composer require drupal/upgrade_status && drush en upgrade_status if you have Drush installed; otherwise go to Admin > Modules, search for Upgrade Status, and install it. After the module is enabled, go to Admin > Reports > Upgrade Status. This page should list all the upgrades and code changes necessary before your site can be upgraded to Drupal 10.

Steps to upgrade Drupal 9 to Drupal 10

Migrating from Drupal 9 to Drupal 10 can be easy or difficult depending on the project, because every site is different and may present its own unique challenges.

These are the steps to migrate from Drupal 9 to Drupal 10.

  1. Keeping your custom modules up to date with new standards and removing deprecated code will mean only small changes before a major core upgrade. If the site is well maintained, the changes are mostly to core_version_requirement: you will need to update from core_version_requirement: ^9 to core_version_requirement: ^9 || ^10. As soon as we are sure that our custom code is compatible with Drupal 10, we can move on to checking the contrib modules. And this is where things might get tricky.
  2. Upgrade contrib modules to versions which support both Drupal 9 and 10. If the new version only supports D10, you can add it to Composer as an alternative version, for example: 'drupal/module_name': '^1 || ^2'. This way you can keep version 1 installed while on Drupal 9; once you install Drupal 10, Composer will install version 2 of the module.
  3. Uninstall obsolete modules and themes like Color, RDF, Seven, etc. You can't remove core modules or themes, but themes that moved from core to contrib (especially Classy and Stable) can be kept as contrib themes if you are not sure whether anything depends on them. You may need to set your admin theme to Claro and re-export your config on this step. Use the drush theme:uninstall theme(s) command (drush thun for short) to uninstall themes.
  4. Remove orphaned permissions which are still assigned to user roles. Export your config before starting this step. The upgrade_status module will tell you which permissions need to go from which roles. Simply edit the user.role.[role].yml config files and remove the relevant lines from the permissions array.
  5. Upgrade Drupal core to the latest 9.5 release, in order to upgrade to 10.

composer require drupal/core-recommended:^9.5 drupal/core-composer-scaffold:^9.5  --update-with-all-dependencies

Now let's look at the tricky part. Drupal 10 has some specific requirements we have to meet before we can upgrade.

  1. Drupal 10 runs on PHP 8.2 or later, so make sure you have it before upgrading.
  2. Not all modules are Drupal 10 ready, so you may have to use Composer Drupal Lenient and patch Drupal 9-only modules. Install mglaman/composer-drupal-lenient before attempting the upgrade; otherwise these modules will create a dependency problem. After the installation, declare the Drupal 9 modules which should be treated as Drupal 10 compatible. For example, to allow drupal/token to be installed, run:

composer require mglaman/composer-drupal-lenient

composer config --merge --json extra.drupal-lenient.allowed-list '["drupal/token"]'

  3. If you have Drush installed, make sure the version constraint is ^11 || ^12; we use this range because some modules may not be compatible with Drush 12.
  4. Drupal Console (drupal/console) is not Drupal 10 compatible, so we have to remove it:

composer remove drupal/console --no-update

  5. Upgrade CKEditor 4 to CKEditor 5. If you have custom CKEditor 4 plugins which don't have an upgrade path to CKEditor 5, you can keep using CKEditor 4, but as a contrib module. You may have to use it, although it is deprecated, to avoid errors when you deploy the Drupal 10 upgrade and the config import tries to uninstall CKEditor.
  6. If you use Entity Embed, you need to manually replace the CKEditor Embed buttons with SVG icons, as Drupal 10 does not use PNGs.
  7. Check custom code for core/jquery.once; if it exists, update it to use core/once, because jquery.once is removed from Drupal 10.
  8. If you encounter dependency issues and Composer just won't let you upgrade to D10, but you are sure that all the dependencies are in order, the quickest fix is to delete the composer.lock file (although this is not a best practice), delete the vendor folder, and then run composer install, which will install from the composer.json file and create a new lock file.
  9. Note that you can ignore some false positives, such as an empty custom profile, which phpstan can't scan and upgrade_status would therefore complain about; the trick is to create an empty .php file to fix the issue.

Once all of this is done and you are ready to upgrade, run this command:

composer require 'drupal/core-recommended:^10' 'drupal/core-composer-scaffold:^10' 'drupal/core-project-message:^10' --no-update

If drupal/core happens to be part of the composer.json file, remove it. This dependency is included in drupal/core-recommended and may cause problems. If you have drupal/core-dev installed, run:

composer require 'drupal/core-dev:^10' --dev --no-update

Now, let us test the update with the --dry-run option: this allows us to see whether the update will run smoothly or might encounter errors.

composer update --dry-run

If you encounter any errors, work through and resolve them. Once they are resolved, resume the process and run the update with dependencies to update any transitive modules:

composer update -W

Don’t forget to run drush updb and drush cex after the upgrade. This means you should run the upgrade on top of an installed and functioning D9 database.

If you have a custom theme based on Classy, Seven, or Stable, don't forget to install those base themes as contrib themes: composer require 'drupal/[module_name]'. You should uninstall and remove upgrade_status after the upgrade process.

Pathauto has an issue with some config files, so if you have Pathauto installed, you need to update some configs by hand. Namely, pathauto.pattern.[name] config files have had their selection_criteria plugins updated; for example, node_type becomes entity_bundle:node. Alternatively, try the following command:

drush eval 'include_once(DRUPAL_ROOT . "/modules/contrib/pathauto/pathauto.install"); pathauto_update_8108()'

In summary, the whole update process varies from project to project and can be quite difficult. A project could have different modules installed, some even locked to specific dev versions, as well as heavily patched old major versions of modules that need to be upgraded. Walk through these steps, and iterate through them if need be, to get your upgrade completed.

Categories: FLOSS Project Planets