Feeds

Mike Driscoll: PyDev of the Week: Phil Ewels

Planet Python - Mon, 2023-11-13 08:30

This week, we welcome Phil Ewels as our PyDev of the Week! Phil is the creator of the rich-click package. You can find Phil on Twitter, Mastodon, or BlueSky.

You can also catch up with Phil over on GitHub.

Let’s spend a few moments getting to know Phil better!

Can you tell us a little about yourself (hobbies, education, etc):

I’m a scientist turned software developer from the UK, now living in Sweden. My education and career path is a little winding: I studied Biochemistry at Bristol and went on to do a PhD in Molecular Biology at Cambridge, working with techniques to tease out the three-dimensional structure of DNA in order to figure out why some cancers happen more often than we expect. During school and university I did web development on the side for fun, and at some point the two interests fused and I got into the field of bioinformatics – computational analysis of biological data. I ended up at SciLifeLab in Sweden working at a national center doing DNA sequencing and analysis for academic researchers. I’ve been lucky enough to have the opportunity to be behind some popular open-source software for scientists over the years and recently joined Seqera to focus on building scientific OSS and the communities that drive them.

I have three young children so I don’t have much time for hobbies these days! But if I get a chance I love to get out on my bike or into the mountains, and I spent much of my earlier life rowing competitively.

Why did you start using Python?

I switched from Perl to Python when I moved to Sweden, as it was used by everyone else in the group. It immediately clicked with me and I never looked back. I’ve never had any formal training, but I had some great mentors in those early days – Per Kraulis, Mario Giovacchini and Guillermo Carrasco to name a few. I learnt quickly, mostly through their pull-request reviews!

What other programming languages do you know and which is your favorite?

My first languages were BASIC on my older brother’s BBC Micro and then Actionscript in Macromedia Flash! Then PHP, JavaScript, Perl and Python. More recently, I’ve done a lot of work with Nextflow which is a domain-specific language based on Groovy. If I start a new project I’ll always reach for Python, unless there’s a compelling reason to do otherwise.

What projects are you working on now?

As of this month I’m project manager for open-source software at Seqera, so my official remit covers Nextflow (data pipelines), MultiQC (data visualisation / reporting), Wave (on-the-fly containers) and Fusion (fast virtual filesystems). I’m also the co-founder of nf-core and am heavily involved in writing the developer tools for that community. Those are my day job projects, after that I have a long tail of pet projects lurking on GitHub that never quite get the attention that they deserve! The most recent ones are rich-click (see below) and rich-codex, for auto-generating terminal output screenshots from simple markdown commands.

Which Python libraries are your favorite (core or 3rd party)?

I have a soft spot for user-friendly interfaces and fell in love with Rich by Will McGugan / the Textualize team the moment I came across it. More recently they released Textual, which is pure magic and a delight to work with. Before Textual I used Questionary for interactive CLI prompts, which is fun. Oh, and I love all of Sebastián Ramírez's libraries, but especially FastAPI. Those are beautiful in their simplicity and a great source of inspiration, as well as being super intuitive to work with. Also worth a mention are Pyodide and Ruff – neither is really a library, but both do really cool things with Python code, in different ways.

How did the rich-click package come about?

As mentioned above, I'm a bit of a fan-boy when it comes to Rich. I had used it to pretty good effect to make the CLI output of most of my tools colourful. As a result, the output from Click (used for the CLI interface) started to stand out as being a bit… bland. I'd been toying with the idea of building a plugin for a few months, then Will launched rich-cli complete with an attractive CLI. I took his code (with his permission, and help) and generalised it into a package that would do the same for any Python package using Click to generate a CLI. My priority was to make it as simple to use as possible: the basic usage is simply `import rich_click as click`, and the CLI output magically looks nicer. The package has since found its way into quite a few people's projects and inspired native Typer functionality, which is cool. Most recently, Daniel Reeves offered to help with maintenance, which I hope gives the project a more stable base to live on (he also did a cool new logo!).

What is your elevator pitch for the MultiQC tool?

MultiQC is a data visualisation tool that understands the log outputs from hundreds of different scientific software packages. At the end of a typical genomics analysis run you might have outputs from tens to hundreds of different tools for tens to thousands of samples. MultiQC can scoop up the summary statistics from all of these and generate a human-readable HTML report summarising the key quality-control metrics, letting you spot any outlier samples. Whilst MultiQC's origins lie in genomics, it doesn't do anything specific to that field, and it's built in a modular way that makes it easy to extend to any tool outputs. You can play around with it in your browser (the recent Pyodide toy I've been playing with) here.

Is there anything else you’d like to say?

Just a big thanks to you and everyone in the Python community for being such a lovely bunch of people. Python folk are one of a small handful of enclaves that I still really enjoy interacting with on social media and I think much of that stems from our diverse backgrounds. Long may that continue! Oh, and if anyone fancies writing a rich-pre-commit library that I could use, then that’d be awesome!

 

Thanks for doing the interview, Phil!

The post PyDev of the Week: Phil Ewels appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

TechBeamers Python: DataProvider in TestNG – All You Need to Know

Planet Python - Mon, 2023-11-13 08:24

This tutorial explains every little detail about TestNG data providers. For example, what is a dataprovider in TestNG, what is its purpose, and is the dataprovider the same in all versions of TestNG, or has it gained more features? Furthermore, we'll cover the easiest as well as the best methods to use in Java. Also, we'll [...]

The post DataProvider in TestNG – All You Need to Know appeared first on TechBeamers.

Categories: FLOSS Project Planets

Kdenlive 23.08.3 released

Planet KDE - Mon, 2023-11-13 08:18

Kdenlive 23.08.3 continues the stabilization effort of this release cycle in preparation for the Qt6 upgrade. Some highlights of this release include: importing clips is now faster (as part of the performance improvements task); a new PNG-with-alpha render profile was added and the video-with-alpha render profiles were fixed; time remapping can now be applied to sequences; and Whisper now works on all systems.

Full log

  • Fix timeremap. Commit.
  • Fix replace clip keeping audio index from previous clip, sometimes breaking audio. Commit. See bug #476612.
  • Create sequence from selection: ensure we have enough audio tracks for AV groups. Commit.
  • Fix timeline duration incorrect after create sequence from timeline selection. Commit.
  • Fix project duration not updating when moving the last clip of a track to another non last position. Commit. See bug #476493.
  • Don’t lose subtitle styling when switching to another sequence. Commit. Fixes bug #476544.
  • Fix crash dropping url to Library. Commit.
  • When dropping multiple files in project bin, improve import speed by not checking if every file is on a remote drive. Commit.
  • Fix titler shadow incorrectly pasted on selection. Commit. Fixes bug #476393.
  • Fix pasted effects not adjusted to track length. Commit.
  • Fix timeline preview ignored in temporary data dialog. Commit. Fixes bug #475980.
  • Speech to text: fix whisper install aborting after 30secs. Commit.
  • Don’t try to generate proxy clips for audio with clipart. Commit.
  • Clip loading: switch to Mlt::Producer probe() instead of fetching frame. Commit.
  • Multiple fixes for time remap losing keyframes. Commit.
  • Add png with alpha render profile. Commit.
  • Fix Mix not correctly deleted on group track move. Commit.
  • Fix rendering with alpha. Commit.
  • Rotoscoping: don’t auto add a second kfr at cursor pos when creating the initial shape, don’t auto add keyframes until there are 2 keyframes created. Commit.
  • Fix keyframe param not correctly enabled when selecting a clip. Commit.
  • Fix smooth keyframe path sometimes incorrectly drawn on monitor. Commit.
  • Properly adjust timeline clips on sequence resize. Commit.
  • Remove unused debug stuff. Commit.
  • Fix project duration not correctly updated on hide / show track. Commit.
  • Fix resize clip with mix test. Commit.
  • Fix resize clip start to frame 0 of timeline not correctly working in some zoom levels. Commit.

The post Kdenlive 23.08.3 released appeared first on Kdenlive.

Categories: FLOSS Project Planets

Django Weblog: 2024 DSF Board Candidates

Planet Python - Mon, 2023-11-13 07:32
Thank you to the twelve individuals who have chosen to stand for election. This page contains their submitted candidate statements. Our deepest gratitude goes to our departing board member, Aaron Bassett, for your contributions and commitment to the Django community. Those eligible to vote in this election will receive information on how to vote shortly. Please check for an email with the subject line "2024 DSF Board Voting".

Chris Achinga Mombasa, Kenya

My software development career was highly influenced by developer communities. Participating in tech meet-ups and events, notably DjangoCon Africa, has not only expanded my technical skills but also shaped my approach to both personal and professional growth. This experience has motivated me to seek a position on the Django Software Foundation Board, especially after the talk from Anna Makarudze on Navigating the Open-Source World as a Minority, which highlighted the challenges of organising events that benefit African communities.

As an advocate for African and minority communities within the tech ecosystem, I aspire to bring a unique and necessary perspective to the DSF Board. My commitment to volunteering and giving back to the community aligns perfectly with the ethos of the Django community. My experiences have taught me the value of dedicated community organizers who selflessly share resources and knowledge, fostering an environment where developers at all levels can thrive.

Joining the DSF Board would enable me to champion the interests of young and emerging developers globally, particularly from underrepresented regions. I aim to ensure that everyone, regardless of their background, has equitable access to the opportunities that Django, both as a community and a web development framework, can offer.

In my role with a Non-Governmental Organization aiding youth groups along the Kenyan Coast(Swahilipot Hub Foundation), I've garnered experience in community engagement and utilizing technology for social good. This experience has been instrumental in creating Django-based platforms that empower community self-management. My presence on the DSF Board would not only represent these communities but also allow me to serve as a mentor and technical advisor.

I am eager to contribute my insights and leadership to the DSF Board. With your support, I hope to make a meaningful impact, fostering an inclusive and dynamic environment where every developer can achieve their full potential.

David Vaz Porto, Portugal

A software developer for over 20 years, David fell in love with Django almost at the beginning of his journey, in 2007 with version 0.96. He loves Django and Python so much that he has been bringing developers to the community ever since, and ended up starting his own consultancy firm around these technologies.

During DjangoCon Europe 2019 in Copenhagen he decided to take the next step in helping the community, proposing to organize DjangoCon Europe 2020 in Portugal. He got more than he bargained for, ending up co-organising the first virtual-only DjangoCon Europe, repeating in 2021, and finally a hybrid DjangoCon Europe in 2022. His effort, together with the team around him, was rewarded with success: the 2022 edition broke attendance records with 500+ people in person and 200+ online. To keep things going, he is also co-organising DjangoCon Europe 2024 in Vigo, Spain, hoping to bring the Spanish community closer.

David is also contributing to the Portuguese Python community, having started the very first PyCon Portugal in 2022. His drive is to bring the Portuguese community forward, with a different city every year to increase the reach of the conference. The first edition was in Porto, leveraging DjangoCon Europe 2022; this year it was in Coimbra, with participants from over 25 countries, and we are already preparing the next edition.

David is enthusiastic, committed and pragmatic. Throughout his personal and professional journey, he has always had a positive impact in every process he puts his mind on, influencing, building and empowering the people around him. He hopes to put his experience to good use in Django Software Foundation.

Jacob Kaplan-Moss Oregon

I was one of the original maintainers of Django, and was the original founder and first President of the DSF. I re-joined the DSF board and have served for the last year. Outside of Django, I'm a security consultant at Latacora, and I previously ran engineering and security teams at 18F and Heroku.

When I ran for the board last year, I wrote:

> I'd be coming back to the DSF with a bunch of experience in executive leadership and more experience working with nonprofits. I think I can apply those skills, along with my general knowledge of the Django community, to push things forward. What that means, specifically, isn't entirely clear yet. I'd plan to spend the first months of my board term asking a bunch of questions and listening.

I did that asking-questions-and-listening, and what needs doing at the DSF became clear. I'd most succinctly articulate it as: "new blood".

The Django community is super-vibrant and new people are joining the community all the time, but it's very hard for people to "level up" and move to any sort of leadership position at the DSF or among the core team. We just don't have very many opportunities for people to have an impact, and we don't have good "onramps" to that work.

So, this term, I (with the rest of the board) started building some of these opportunities and onramps! The recently-announced working group and membership changes are the start of this, and if re-elected I'd want to continue working in this direction. It's now easier for people to join the DSF, and easier for them to spin up working groups to do impactful work. But now we need to start defining these groups, funding them, and continuing this growth.

Jay Miller United States

The Django community often serves as a great example for many aspects of the broader Python community. Our community shines when many of us get involved. To make this happen, we need to encourage greater community involvement.

My goals for the next two years, if elected, are to increase the amount of information we share with the community while reducing the time it takes to disseminate that information to the community.

I intend to utilize the existing channels in the Django community and the larger Python community. We will also establish new official communication channels for the foundation. These channels will be managed by a Communications Working Group.

The second effort is to extend our reach to a global and diverse audience. We understand that our impact can extend far beyond our current scope by expanding working groups. Therefore, I would work to create and support working groups that currently lack direct representation in the DSF. I would also advocate for decisions that directly impact these areas to be developed and executed by those individual groups with DSF support.

I hope that you will support me in this vision, which aims to increase the visibility and support of the DSF to the farthest reaches of the community.

Mahmoud Nassee Cairo/Egypt

I really like helping people and also helping this awesome community to grow. I don't have much to say 🙂 But I really enjoy volunteer work: it helps me make something that I can be proud of and also make some new friends!

Ngazetungue Muheue Namibia

I'm Ngazetungue Muheue, a dedicated software developer, community advocate, and a member of the Django Software Foundation (DSF). I'm also the founder of the Python and Django Community in Namibia. Despite facing unique challenges as a member of underprivileged communities and living with a disability, I've played a significant role in expanding Django by establishing and contributing to various Django and Python communities in Africa and Namibia.

Recognizing the importance of open-source communities and user-friendly technology, I've worked closely with students and underprivileged individuals to bridge the tech gap by involving them in Django user groups, teaching Django, and fostering their participation in the global tech community. As a visionary leader, I've cultivated a culture of collaboration, inclusivity, and continuous learning within the African tech ecosystem. My contributions include organizing the inaugural DjangoCon Africa in 2023 and actively participating in organizing and volunteering at DjangoCon Europe in 2023 and 2022, advancing the growth of the Django ecosystem. I've also spoken at various PyCon events worldwide, showcasing my commitment to fostering the global Django and Python community.

As a board member of the Django Software Foundation, my primary goal is to expand Django communities worldwide, connect underprivileged community members with the DSF, and enhance the inclusivity of the Django community. This involves translating Django documentation for non-English speakers, increasing project grants, integrating people with disabilities into the community, and creating internship opportunities for a more diverse and empowered Django community.

Joining the DSF board will enable me to inspire and support nations in engaging young and underprivileged individuals in tech-related activities while safeguarding the interests and mission of our community and the DSF. More links: https://twitter.com/muheuenga https://2023.djangocon.africa/team https://twitter.com/djangonamibia https://na.pycon.org/ https://pynam.org/django/

Paolo Melchiorre Pescara, Italy

Ciao, I'm Paolo and I live in Italy.

I've been a contributor to the Django project for years, and a member of the DSF. I attended my first DjangoCon Europe in 2017 and have since presented many Django talks at conferences around the world. I've participated as a coach in DjangoGirls workshops several times, and I organized one in my hometown. I've always been a Python developer, I helped the PyCon Italia organization for a few years and I recently founded the Python Pescara meetup.

As a member of the DSF board of directors, I would like to bring a different point of view to the foundation, as a southern European citizen, inhabitant of the Mediterranean area, non-native English speaker, and a small company employee.

Some initiatives I would like to carry forward are:

  • organize active user sprints to focus on specific Django features
  • continue the work of renovating the Django project website
  • create synergies with the Python community and its web sub-communities
  • simplify Django documentation and help its translations
  • support creators of Django content (e.g. books, articles, podcasts, videos, ...)
Peter Baumgartner Colorado, USA

I'm a current DSF board member and acting Treasurer.

I've been a part of the Django community for over 15 years. I'm an open-source contributor, a regular speaker at DjangoCon US, and the co-author of High Performance Django. In 2007, I founded Lincoln Loop, a web agency that leverages Django extensively in its work. Lincoln Loop has financially sponsored the DSF and DjangoCon for many years, and I'm looking for other ways to give back to a community that has given us so much.

At Lincoln Loop, I have to wear many hats and deeply understand the financial ramifications of our decisions as a company. I believe the experience of running a business will be directly applicable to a position on the DSF board, and I look forward to applying that experience if elected.

Sarah Abderemane Paris, France

I'm an active DSF member and I've been contributing to this amazing community in multiple ways:

  • Django contributor and Accessibility Team Member
  • Maintainer of djangoproject.com
  • Organizer of Djangonaut Space
  • Organizer of Django Paris Meetup
  • Organizer of DjangoCon Europe 2023

I have seen many aspects of the community through all those experiences. As a relatively new member, I can bring a fresh perspective to the community and help foster a renewed sense of togetherness. I have a strong connection with the Djangonaut Space mentoring program and the community. I'm well positioned to serve as an intermediary, facilitating communication regarding initiatives and ideas between the board and the community.

I would like to increase fundraising by improving communication and by making each sponsor feel special, highlighting sponsors not only on the website but also on social networks. Relying on my experience with various Django projects, I will push forward ideas to further develop our community, specifically helping existing and new contributors.

With the community's support, I will set up a working group for mentorship and push accessibility in the framework. I am passionate about these topics as they show that Django is a framework for everyone, by everyone.

I see myself as a representative of Django's diversity and would like to emphasize and expand that richness even more. Being part of the board would inspire people to get involved and be part of the community. They could add their stone to the building of this wonderful community.
Thibaud Colas Europe

To me, Django feels like it's in maintenance mode, a decade behind in areas like front-end development and serverless. To stay relevant compared to projects with tens of millions in venture capital, we need a more vibrant, more diverse community. We can build one together by making the right programs happen, like Djangonaut Space and Outreachy.

The DSF also needs to evolve with the times. In the age of ChatGPT, copyright and trademarks are very dated concerns. We need a foundation that can help its community navigate modern societal challenges: social equity issues affecting our users; accessibility issues plaguing the Django web; climate change and Django's carbon footprint.

I can help. Let's grow Django's contributors 10x, and have the Django universe lead by example in community-driven open source.

Tom Carrick Amsterdam, Netherlands

I've been using Django since 2008. A lot has changed since then, but one constant has been my wish to see Django continuously improve.

I'm active in the community in many ways. I've been a regular code contributor since 2016. I founded the accessibility team, and also started the official Discord server. So I've dedicated quite some time to Django already, but I have room for more, with even more impact.

I would like to help grow the next generation of Django contributors, from more diverse backgrounds. From running DjangoCon sprint tables over the years, and getting involved with Djangonaut Space, it's clear to me that the new contributor experience has substantial room for improvement.

I also want to expand Django's fundraising efforts. It's becoming difficult to add important new features. We need more funding to hire more Fellows, and expand their remit to work on bigger features.

The new working groups are a much needed initiative, and I'd love to help develop all these ideas to their fullest potential.

Velda Kiara Nairobi, Kenya

As a passionate software developer and technical writer deeply rooted in the open-source community, I am honored to be running for the DSF board. My experience in contributing to open-source projects, coupled with my leadership background in Open Source Community Africa Nairobi, has ignited my desire to enhance the participation and contributions of communities from diverse backgrounds. My involvement in open-source initiatives has made me appreciate the power of collaboration and the impact of collective efforts. I have witnessed firsthand how open-source communities foster innovation and inclusivity, enabling individuals from all over the world to share their knowledge and expertise.

Driven by my belief in the impact of open source, I aspire to elevate the DSF board's decision-making process by incorporating the unique perspectives and insights of communities from diverse backgrounds. My experience working with developer communities has equipped me with the skills and empathy necessary to understand and address the specific needs of these underrepresented groups. As a leader, I prioritize decision-making that aligns with the needs and aspirations of the community. I believe in fostering an environment where everyone feels empowered to participate, contribute, and lead. My commitment to inclusivity extends beyond the color of one's skin; I envision a DSF community that embraces and celebrates diversity of thought, experience, and background.

My passion for Django and my role as an advocate for the framework extend beyond personal preference. I recognize the immense value of Django to the developer community and am eager to contribute further through the DSF board. I believe that my involvement will allow me to add value to the Django community, supporting its growth and ensuring that it remains a thriving hub for developers worldwide. My journey in the open-source community began with a fascination for the framework. However, over time, I have come to realize that the true beauty of open source lies in the community that surrounds it. I am committed to giving back to this community, not just as a developer or technical writer, but also as a leader and advocate for diversity and inclusion.

I humbly ask for your vote to join the DSF board and contribute my skills, experience, and passion to the continued growth and success of the Django community. Together, we can create a more inclusive and vibrant open-source ecosystem that empowers individuals from all backgrounds to innovate, collaborate, and make a lasting impact on the world.

Categories: FLOSS Project Planets

ADCI Solutions: How to implement geo-dependent content on a Drupal website

Planet Drupal - Mon, 2023-11-13 05:09

Geo-dependent content is a type of website content that changes depending on the user's location data. Here's how to implement it on your Drupal website.

Categories: FLOSS Project Planets

Python GUIs: How to Restore the Window's Geometry in a PyQt App — Make Your Windows Remember Their Last Geometry

Planet Python - Mon, 2023-11-13 01:00

In GUI applications, the window's position and size are known as the window geometry. Saving and restoring the geometry of a window between executions is a useful feature in many applications. With persistent geometry, users can arrange applications on their desktop for an optimal workflow and have the applications return to those positions every time they are launched.

In this tutorial, we will explore how to save and restore the geometry and state of a PyQt window using the QSettings class. With this functionality, you will be able to give your applications a usability boost.

To follow along with this tutorial, you should have prior knowledge of creating GUI apps with Python and PyQt. Additionally, having a basic understanding of using the QSettings class to manage an application's settings will be beneficial.

Understanding a Window's Geometry

PyQt defines the geometry of a window using a few properties. These properties represent a window's position on the screen and its size. Here's a summary of PyQt's geometry-related properties:

  • x: Holds the x coordinate of a widget relative to its parent. If the widget is a window, x includes any window frame and is relative to the desktop. Defaults to 0. Access method: x()
  • y: Holds the y coordinate of a widget relative to its parent. If the widget is a window, y includes any window frame and is relative to the desktop. Defaults to 0. Access method: y()
  • pos: Holds the position of the widget within its parent widget. If the widget is a window, the position is relative to the desktop and includes any frame. Access method: pos()
  • geometry: Holds the widget's geometry relative to its parent, excluding the window frame. Access method: geometry()
  • width: Holds the width of the widget, excluding any window frame. Access method: width()
  • height: Holds the height of the widget, excluding any window frame. Access method: height()
  • size: Holds the size of the widget, excluding any window frame. Access method: size()

In PyQt, the QWidget class provides the access methods shown above. Note that when your widget is a window or form, the first three methods operate on the window and its frame, while the last four methods operate on the client area, which is the window's workspace without the external frame.

Additionally, the x and y coordinates are relative to the screen of your computer. The origin of coordinates is the upper-left corner of the screen, at which point both x and y are 0.

Let's create a small demo app to inspect all these properties in real time. To do this, go ahead and fire up your code editor or IDE and create a new Python file called geometry_properties.py. Then add the following code to the file and save it in your favorite working directory:

```python
from PyQt6.QtWidgets import (
    QApplication,
    QLabel,
    QMainWindow,
    QPushButton,
    QVBoxLayout,
    QWidget,
)


class Window(QMainWindow):
    def __init__(self):
        super().__init__()
        self.setWindowTitle("Window's Geometry")
        self.resize(400, 200)
        self.central_widget = QWidget()
        self.global_layout = QVBoxLayout()
        self.geometry_properties = [
            "x",
            "y",
            "pos",
            "width",
            "height",
            "size",
            "geometry",
        ]
        for prop in self.geometry_properties:
            self.__dict__[f"{prop}_label"] = QLabel(f"{prop}:")
            self.global_layout.addWidget(self.__dict__[f"{prop}_label"])
        button = QPushButton("Update Geometry Properties")
        button.clicked.connect(self.update_labels)
        self.global_layout.addWidget(button)
        self.central_widget.setLayout(self.global_layout)
        self.setCentralWidget(self.central_widget)

    def update_labels(self):
        for prop in self.geometry_properties:
            self.__dict__[f"{prop}_label"].setText(
                f"{prop}: {getattr(self, prop)()}"
            )


if __name__ == "__main__":
    app = QApplication([])
    window = Window()
    window.show()
    app.exec()
```

Wow! There's a lot of code in this file. First, we import the required classes from PyQt6.QtWidgets. Then, we create our app's main window by inheriting from QMainWindow.

In the initializer method, we set the window's title and size using setWindowTitle() and resize(), respectively. Next, we define a central widget and a layout for our main window.

We also define a list of properties. We'll use that list to add some QLabel objects. Each label will show a geometry property and its current values. The Update Geometry Properties button allows us to update the value of the window's geometry properties.

Finally, we define the update_labels() method to update the values of all the geometry properties using their corresponding access methods. That's it! Go ahead and run the app. You'll get the following window on your screen:

A Window Showing Labels for Every Geometry Property

    Looking good! Now go ahead and click the Update Geometry Properties button. You'll see how all the properties get updated. Your app's window will look something like this:

    A Window Showing the Current Value of Every Geometry Property

    As you can see, x and y are numeric values, while pos is a QPoint object with x and y as its coordinates. These properties define the position of this window on your computer screen.

    The width and height properties are also numeric values, while the size property is a QSize object built from the current width and height.

    Finally, the geometry property is a QRect object. In this case, the rectangle comprises x, y, width, and height.

    Great! With this first approach to how PyQt defines a window's geometry, we're ready to continue digging into this tutorial's main topic: restoring the geometry of a window in PyQt.

    Keeping an App's Geometry Settings: The QSettings Class

    Users of GUI apps will generally expect the apps to remember their settings across sessions. This information is often referred to as settings or preferences. In PyQt applications, you'll manage settings and preferences using the QSettings class. This class allows you to have persistent platform-independent settings in your GUI app.

    A commonly expected feature is that the app remembers the geometry of its windows, particularly the main window.

    In this section, you'll learn how to save and restore the window's geometry in a PyQt application. Let's start by creating a skeleton PyQt application to kick things off. Go ahead and create a new Python file called geometry.py. Once you have the file opened in your favorite code editor or IDE, then add the following code:

    from PyQt6.QtWidgets import QApplication, QMainWindow

    class Window(QMainWindow):
        def __init__(self):
            super(Window, self).__init__()
            self.setWindowTitle("Window's Geometry")
            self.move(50, 50)
            self.resize(400, 200)

    if __name__ == "__main__":
        app = QApplication([])
        window = Window()
        window.show()
        app.exec()

    This code creates a minimal PyQt app with an empty main window. The window will appear at 50 pixels from the upper left corner of your computer screen and have a size of 400 by 200 pixels.

    We'll use the above code as a starting point to make the app remember and restore the main window's geometry across sessions.

    First, we need to have a QSettings instance in our app. Therefore, you have to import QSettings from PyQt6.QtCore and instantiate it as in the code below:

    from PyQt6.QtCore import QSettings
    from PyQt6.QtWidgets import QApplication, QMainWindow

    class Window(QMainWindow):
        def __init__(self):
            super(Window, self).__init__()
            self.setWindowTitle("Window's Geometry")
            self.move(50, 50)
            self.resize(400, 200)
            self.settings = QSettings("PythonGUIs", "GeometryApp")

    When instantiating QSettings, we must provide the name of our company or organization and the name of our application. We use "PythonGUIs" as the organization and "GeometryApp" as the application name.

    Now that we have a QSettings instance, we should implement two methods. The first method should allow you to save the app's settings and preferences. The second method should help you read and load the settings. In this tutorial, we'll call these methods write_settings() and read_settings(), respectively:

    class Window(QMainWindow):
        # ...

        def write_settings(self):
            # Write settings here...
            pass

        def read_settings(self):
            # Read settings here...
            pass

    Note that our methods don't do anything yet. You'll write them in a moment. For now, they're just placeholders.

    The write_settings() method must be called when the user closes or terminates the application. This way, you guarantee that all the modified settings get saved for the next session. So, the appropriate place to call write_settings() is from the main window's close event handler.

    Let's override the closeEvent() method as in the code below:

    class Window(QMainWindow):
        # ...

        def closeEvent(self, event):
            self.write_settings()
            super().closeEvent(event)
            event.accept()

    In this code, we override the closeEvent() handler method. The first line calls write_settings() to ensure that we save the current state of our app's settings. Then, we call the closeEvent() of our superclass QMainWindow to ensure the app's window closes correctly. Finally, we accept the current event to signal that it's been processed.
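The handler-override pattern here (do your own work first, then delegate to the superclass) is plain Python inheritance. A PyQt-free sketch, with illustrative names:

```python
class Base:
    def close_event(self, event):
        # the base class finishes handling the event
        event["handled"] = True

class Window(Base):
    def write_settings(self):
        self.saved = True

    def close_event(self, event):
        self.write_settings()       # persist state before closing
        super().close_event(event)  # then let the base class finish

w = Window()
event = {"handled": False}
w.close_event(event)
```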

    Now, where should we call read_settings() from? In this example, the best place for calling the read_settings() method is .__init__(). Go ahead and add the following line of code to the end of your __init__() method:

    class Window(QMainWindow):
        def __init__(self):
            super().__init__()
            self.setWindowTitle("Window's Geometry")
            self.move(50, 50)
            self.resize(400, 200)
            self.settings = QSettings("PythonGUIs", "GeometryApp")
            self.read_settings()

    By calling the read_settings() method from __init__(), we ensure that our app will read and load its settings every time the main window gets created and initialized.

    Great! We're on the way to getting our application to remember and restore its window's geometry. First, you need to know that you have at least two ways to restore the geometry of a window in PyQt:

    • Using the pos and size properties
    • Using the geometry property

    In both cases, you need to save the current value of the selected property and load the saved value when the application starts. To kick things off, let's start with the first approach.

    Restoring the Window's Geometry With pos and size

    In this section, we'll first write the required code to save the current value of pos and size by taking advantage of our QSettings object. The code snippet below shows the changes that you need to make on your write_settings() method to get this done:

    class Window(QMainWindow):
        # ...

        def write_settings(self):
            self.settings.setValue("pos", self.pos())
            self.settings.setValue("size", self.size())

    This code is straightforward. We call the setValue() method on our setting object to set the "pos" and "size" configuration parameters. Note that we get the current value of each property using the corresponding access method.

    With the write_settings() method updated, we're now ready to read and load the geometry properties from our app's settings. Go ahead and update the read_settings() method as in the code below:

    class Window(QMainWindow):
        # ...

        def read_settings(self):
            self.move(self.settings.value("pos", defaultValue=QPoint(50, 50)))
            self.resize(self.settings.value("size", defaultValue=QSize(400, 200)))

    The first line inside read_settings() retrieves the value of the "pos" setting parameter. If there's no saved value for this parameter, then we use QPoint(50, 50) as the default value. Next, the move() method moves the app's window to the resulting position on your screen.

    The second line in read_settings() does something similar to the first one. It retrieves the current value of the "size" parameter and resizes the window accordingly.

    Great! It's time for a test! Go ahead and run your application. Then, move the app's window to another position on your screen and resize the window as desired. Finally, close the app's window to terminate the current session. When you run the app again, the window will appear in the same position. It will also have the same size.

    If you have any issues completing and running the example app, then you can grab the entire code below:

    from PyQt6.QtCore import QPoint, QSettings, QSize
    from PyQt6.QtWidgets import QApplication, QMainWindow

    class Window(QMainWindow):
        def __init__(self):
            super().__init__()
            self.setWindowTitle("Window's Geometry")
            self.move(50, 50)
            self.resize(400, 200)
            self.settings = QSettings("PythonGUIs", "GeometryApp")
            self.read_settings()

        def write_settings(self):
            self.settings.setValue("pos", self.pos())
            self.settings.setValue("size", self.size())

        def read_settings(self):
            self.move(self.settings.value("pos", defaultValue=QPoint(50, 50)))
            self.resize(self.settings.value("size", defaultValue=QSize(400, 200)))

        def closeEvent(self, event):
            self.write_settings()
            super().closeEvent(event)
            event.accept()

    if __name__ == "__main__":
        app = QApplication([])
        window = Window()
        window.show()
        app.exec()

    Now you know how to restore the geometry of a window in a PyQt app using the pos and size properties. It's time to change gears and learn how to do this using the geometry property.

    Restoring the Window's Geometry With geometry

    We can also restore the geometry of a PyQt window using its geometry property and the restoreGeometry() method. To do that, we first need to save the current geometry using our QSettings object.

    Go ahead and create a new Python file in your working directory. Once you have the file in place, add the following code to it:

    from PyQt6.QtCore import QByteArray, QSettings
    from PyQt6.QtWidgets import QApplication, QMainWindow

    class Window(QMainWindow):
        def __init__(self):
            super().__init__()
            self.setWindowTitle("Window's Geometry")
            self.move(50, 50)
            self.resize(400, 200)
            self.settings = QSettings("PythonGUIs", "GeometryApp")
            self.read_settings()

        def write_settings(self):
            self.settings.setValue("geometry", self.saveGeometry())

        def read_settings(self):
            self.restoreGeometry(self.settings.value("geometry", QByteArray()))

        def closeEvent(self, event):
            self.write_settings()
            super().closeEvent(event)
            event.accept()

    if __name__ == "__main__":
        app = QApplication([])
        window = Window()
        window.show()
        app.exec()

    There are only two changes in this code compared to the code from the previous section. We've modified the implementation of the write_settings() and read_settings() methods.

    In write_settings(), we use setValue() to save the current geometry of our app's window. The saveGeometry() method allows us to access and save the current window's geometry. In read_settings(), we call the value() method to retrieve the saved geometry value. Then, we use restoreGeometry() to restore the geometry of our window.

    Again, you can run the application several times in a row, changing the position and size of its main window, to verify that your code works correctly.

    Restoring the Window's Geometry and State

    If your app's window has toolbars and dock widgets, then you may also want to restore their state on the parent window. To do that, you can use the restoreState() method. To illustrate this, let's reuse the code from the previous section.

    Update the content of write_settings() and read_settings() as follows:

    class Window(QMainWindow):
        # ...

        def write_settings(self):
            self.settings.setValue("geometry", self.saveGeometry())
            self.settings.setValue("windowState", self.saveState())

        def read_settings(self):
            self.restoreGeometry(self.settings.value("geometry", QByteArray()))
            self.restoreState(self.settings.value("windowState", QByteArray()))

    In write_settings(), we add a new setting value called "windowState". To keep this setting, we use the saveState() method, which saves the current state of this window's toolbars and dock widgets. Meanwhile, in read_settings(), we restore the window's state by calling the value() method, as usual, to get the state value back from our QSettings object. Finally, we use restoreState() to restore the state of toolbars and dock widgets.

    Now, to make sure that this new code works as expected, let's add a sample toolbar and a dock window to our app's main window. Go ahead and add the following methods right after the __init__() method:

    from PyQt6.QtCore import QByteArray, QSettings, Qt
    from PyQt6.QtWidgets import QApplication, QDockWidget, QMainWindow

    class Window(QMainWindow):
        def __init__(self):
            super().__init__()
            self.setWindowTitle("Window's State")
            self.resize(400, 200)
            self.settings = QSettings("PythonGUIs", "GeometryApp")
            self.create_toolbar()
            self.create_dock()
            self.read_settings()

        def create_toolbar(self):
            toolbar = self.addToolBar("Toolbar")
            toolbar.addAction("One")
            toolbar.addAction("Two")
            toolbar.addAction("Three")

        def create_dock(self):
            dock = QDockWidget("Dock", self)
            dock.setAllowedAreas(
                Qt.DockWidgetArea.LeftDockWidgetArea
                | Qt.DockWidgetArea.RightDockWidgetArea
            )
            self.addDockWidget(Qt.DockWidgetArea.LeftDockWidgetArea, dock)

        # ...

    In this new update, we first import the Qt namespace from PyQt6.QtCore and QDockWidget from PyQt6.QtWidgets. Then we call the two new methods from __init__() to create the toolbar and dock widget at initialization time.

    In the create_toolbar() method, we create a sample toolbar with three sample buttons. This toolbar will show at the top of our app's window by default.

    Next, we create a dock widget in create_dock(). This widget will occupy the rest of our window's working area.

    That's it! You're now ready to give your app a try. You'll see a window like the following:

    A Window Showing a Sample Toolbar and a Dock Widget

    Play with the toolbar and the dock widget. Move them around. Then close the app's window and run the app again. Your toolbar and dock widget will show in the last position you left them.

    Conclusion

    Through this tutorial, you have learned how to restore the geometry and state of a window in PyQt applications using the QSettings class. By saving and restoring the pos, size, and geometry properties, along with the window state, you can give your users the convenience of windows that keep their position and size across sessions.

    With this knowledge, you can enhance the usability of your PyQt applications, making your app more intuitive and user-friendly.

    Categories: FLOSS Project Planets

    Freexian Collaborators: Monthly report about Debian Long Term Support, October 2023 (by Roberto C. Sánchez)

    Planet Debian - Sun, 2023-11-12 19:00

    Like each month, have a look at the work funded by Freexian’s Debian LTS offering.

    Debian LTS contributors

    In October, 18 contributors were paid to work on Debian LTS; their reports are available:

    • Adrian Bunk did 8.0h (out of 7.75h assigned and 10.0h from previous period), thus carrying over 9.75h to the next month.
    • Anton Gladky did 9.5h (out of 9.5h assigned and 5.5h from previous period), thus carrying over 5.5h to the next month.
    • Bastien Roucariès did 16.0h (out of 16.75h assigned and 1.0h from previous period), thus carrying over 1.75h to the next month.
    • Ben Hutchings did 8.0h (out of 17.75h assigned), thus carrying over 9.75h to the next month.
    • Chris Lamb did 17.0h (out of 17.75h assigned), thus carrying over 0.75h to the next month.
    • Emilio Pozuelo Monfort did 17.5h (out of 17.75h assigned), thus carrying over 0.25h to the next month.
    • Guilhem Moulin did 9.75h (out of 17.75h assigned), thus carrying over 8.0h to the next month.
    • Helmut Grohne did 1.5h (out of 10.0h assigned), thus carrying over 8.5h to the next month.
    • Lee Garrett did 10.75h (out of 17.75h assigned), thus carrying over 7.0h to the next month.
    • Markus Koschany did 30.0h (out of 30.0h assigned).
    • Ola Lundqvist did 4.0h (out of 0h assigned and 19.5h from previous period), thus carrying over 15.5h to the next month.
    • Roberto C. Sánchez did 12.0h (out of 5.0h assigned and 7.0h from previous period).
    • Santiago Ruano Rincón did 13.625h (out of 7.75h assigned and 8.25h from previous period), thus carrying over 2.375h to the next month.
    • Sean Whitton did 13.0h (out of 6.0h assigned and 7.0h from previous period).
    • Sylvain Beucler did 7.5h (out of 11.25h assigned and 6.5h from previous period), thus carrying over 10.25h to the next month.
    • Thorsten Alteholz did 14.0h (out of 14.0h assigned).
    • Tobias Frost did 16.0h (out of 9.25h assigned and 6.75h from previous period).
    • Utkarsh Gupta did 0.0h (out of 0.75h assigned and 17.0h from previous period), thus carrying over 17.75h to the next month.
    Evolution of the situation

    In October, we have released 49 DLAs.

    Of particular note in the month of October, LTS contributor Chris Lamb issued DLA 3627-1 pertaining to Redis, the popular key-value database similar to Memcached, which was vulnerable to an authentication bypass vulnerability. Fixing this vulnerability involved dealing with a race condition that could allow another process an opportunity to establish an otherwise unauthorized connection. LTS contributor Markus Koschany was involved in the mitigation of CVE-2023-44487, which is a protocol-level vulnerability in the HTTP/2 protocol. The impacts within Debian involved multiple packages, across multiple releases, with multiple advisories being released (both DSA for stable and old-stable, and DLA for LTS). Markus reviewed patches and security updates prepared by other Debian developers, investigated reported regressions, provided patches for the aforementioned regressions, and issued several security updates as part of this.

    Additionally, as MariaDB 10.3 (the version originally included with Debian buster) reached end-of-life earlier this year, LTS contributor Emilio Pozuelo Monfort has begun investigating the feasibility of backporting MariaDB 10.11. The work is in early stages, with much testing and analysis remaining before a final decision can be made, as this is only one of several potential courses of action concerning MariaDB.

    Finally, LTS contributor Lee Garrett has invested considerable effort into the development of the Functional Test Framework. While so far only an initial version has been published, it already has several features which we intend to begin leveraging for testing of LTS packages. In particular, the FTF supports provisioning multiple VMs for the purposes of performing functional tests of network-facing services (e.g., file services, authentication, etc.). These tests are in addition to the various unit-level tests which are executed during package build time. Development work will continue on FTF, and as it matures and begins to see wider use within LTS, we expect it to improve the quality of the updates we publish.

    Thanks to our sponsors

    Sponsors that joined recently are in bold.


    Ned Batchelder: Debug helpers in coverage.py

    Planet Python - Sun, 2023-11-12 16:40

    Debugging in the coverage.py code can be difficult, so I’ve written a lot of helper code to support debugging. I just added some more.

    These days I’m working on adding support in coverage.py for sys.monitoring. This is a new feature in Python 3.12 that completely changes how Python reports information about program execution. It’s a big change to coverage.py and it’s a new feature in Python, so while working on it I’ve been confused a lot.

    Some of the confusion has been about how sys.monitoring works, and some was eventually diagnosed as a genuine bug involving sys.monitoring. But all of it started as straight-up “WTF!?” confusion. My preferred debugging approach at times like this is to log a lot of detailed information and then pore over it.

    For something like sys.monitoring where Python is calling my functions and passing code objects, it’s useful to see stack traces for each function call. And because I’m writing large log files it’s useful to be able to tailor the information to the exact details I need so I don’t go cross-eyed trying to find the clues I’m looking for.

    I already had a function for producing compact log-friendly stack traces. For this work, I added more options to it. Now my short_stack function produces one line per frame, with options for which frames to include (it can omit the 20 or so frames of pytest before my own code is involved); whether to show the full file name, or an abbreviated one; and whether to include the id of the frame object:

                         _hookexec : 0x10f23c120 syspath:/pluggy/_manager.py:115
                        _multicall : 0x10f308bc0 syspath:/pluggy/_callers.py:77
                pytest_pyfunc_call : 0x10f356340 syspath:/_pytest/python.py:194
        test_thread_safe_save_data : 0x10e056480 syspath:/tests/test_concurrency.py:674
                         __enter__ : 0x10f1a7e20 syspath:/contextlib.py:137
                           collect : 0x10f1a7d70 cov:/control.py:669
                             start : 0x10f1a7690 cov:/control.py:648
                             start : 0x10f650300 cov:/collector.py:353
                     _start_tracer : 0x10f5c4e80 cov:/collector.py:296
                          __init__ : 0x10e391ee0 cov:/pep669_tracer.py:155
                               log : 0x10f587670 cov:/pep669_tracer.py:55
                       short_stack : 0x10f5c5180 cov:/pep669_tracer.py:93
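A compact, one-line-per-frame trace like the one above can be produced with the standard inspect module. This is a simplified sketch, not coverage.py's actual implementation:

```python
import inspect

def short_stack(limit=None):
    # one line per frame: right-aligned function name, then file:line
    lines = []
    for info in inspect.stack()[1:limit]:  # [1:] skips short_stack itself
        lines.append(f"{info.function:>20} : {info.filename}:{info.lineno}")
    return "\n".join(lines)

def caller():
    return short_stack()

print(caller())
```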

    Once I had these options implemented in a quick way and they proved useful, I moved the code into coverage.py’s debug.py file and added tests for the new behaviors. This took a while, but in the end I think it was worth it. I don’t need to use these tools often, but when I do, I’m deep in a bad problem and I want to have a well-sharpened tool at hand.

    Writing debug code is like writing tests: it’s just there to support you in development, it’s not part of “the product.” But it’s really helpful. You should do it. It could be something as simple as a custom __repr__ method for your classes to show just the information you need.
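For instance, a custom __repr__ (class and field names here are illustrative) replaces the default `<object at 0x...>` output with exactly the fields you care about:

```python
class Span:
    def __init__(self, start, stop):
        self.start = start
        self.stop = stop

    def __repr__(self):
        # show just the interesting fields, not the memory address
        return f"Span({self.start}..{self.stop})"

print(repr(Span(3, 17)))  # Span(3..17)
```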

    It’s especially helpful when your code deals in specialized domains or abstractions. Your debugging code can speak the same language. Zellij was a small side project of mine to draw geometric art like this:

    When the over-under weaving code wasn’t working right, I added some tooling to get debug output like this:

    I don’t remember what the different colors, patterns, and symbols meant, but at the time it was very helpful for diagnosing what was wrong with the code.


    #! code: Drupal 10: Running Drupal Tests On GitHub Using Workflows

    Planet Drupal - Sun, 2023-11-12 14:10

    There are a number of different tools that allow you to validate and test a Drupal site. Inspecting your custom code allows you to adhere to coding standards and ensure that you stamp out common coding problems. Adding tests allows you to make certain that the functionality of your Drupal site works correctly.

    If you have tests in your Drupal project then you ideally need to be running them at some point in your development workflow. Getting GitHub to run the tests when you push code or create a pull request gives you peace of mind that your test suite is being run at some point in your workflow. You also want to be able to run your tests locally with ease, without having to remember lots of command line arguments.

    In this article I will show how to set up validation and tests against a Drupal site and how to get GitHub to run these steps when you create a pull request. This assumes you have a Drupal 10 project that is controlled via composer.

    Let's start with creating a runner using Makefile.

    Makefile

    A Makefile is an automation tool that allows developers to create a dependency structure of tasks that is then run by using the "make" command. This file format was originally developed to assist with compiling complex projects, but it can easily be used to run any automation script you need.

    For example, let's say that we want to run a command that has a number of different parameters. This might be a curl command or even an rsync command, where the order of the parameters is absolutely critical. To do this you would create a file called "Makefile" and add the following.

    sync-files:
    	rsync -avzh source/directory destination/directory

    To run this you just need to type "make" followed by the name of the command.

    Read more


    Lukas Märdian: Netplan brings consistent network configuration across Desktop, Server, Cloud and IoT

    Planet Debian - Sun, 2023-11-12 10:00

    Ubuntu 23.10 “Mantic Minotaur” Desktop, showing network settings

    We released Ubuntu 23.10 ‘Mantic Minotaur’ on 12 October 2023, shipping its proven and trusted network stack based on Netplan. Netplan is the default tool to configure Linux networking on Ubuntu since 2016. In the past, it was primarily used to control the Server and Cloud variants of Ubuntu, while on Desktop systems it would hand over control to NetworkManager. In Ubuntu 23.10 this disparity in how to control the network stack on different Ubuntu platforms was closed by integrating NetworkManager with the underlying Netplan stack.

    Netplan could already be used to describe network connections on Desktop systems managed by NetworkManager. But network connections created or modified through NetworkManager would not be known to Netplan, so it was a one-way street. Activating the bidirectional NetworkManager-Netplan integration allows for any configuration change made through NetworkManager to be propagated back into Netplan. Changes made in Netplan itself will still be visible in NetworkManager, as before. This way, Netplan can be considered the “single source of truth” for network configuration across all variants of Ubuntu, with the network configuration stored in /etc/netplan/, using Netplan’s common and declarative YAML format.
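For reference, a minimal Netplan file in /etc/netplan/ looks something like the sketch below (the interface name and values are illustrative):

```yaml
network:
  version: 2
  renderer: NetworkManager
  ethernets:
    enp3s0:
      dhcp4: true
```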

    Netplan Desktop integration

    On workstations, the most common scenario is for users to configure networking through NetworkManager’s graphical interface, instead of driving it through Netplan’s declarative YAML files. Netplan ships a “libnetplan” library that provides an API to access Netplan’s parser and validation internals, which is now used by NetworkManager to store any network interface configuration changes in Netplan. For instance, network configuration defined through NetworkManager’s graphical UI or D-Bus API will be exported to Netplan’s native YAML format in the common location at /etc/netplan/. This way, the only thing administrators need to care about when managing a fleet of Desktop installations is Netplan. Furthermore, programmatic access to all network configuration is now easily accessible to other system components integrating with Netplan, such as snapd. This solution has already been used in more confined environments, such as Ubuntu Core and is now enabled by default on Ubuntu 23.10 Desktop.

    Migration of existing connection profiles

    On installation of the NetworkManager package (network-manager >= 1.44.2-1ubuntu1) in Ubuntu 23.10, all your existing connection profiles from /etc/NetworkManager/system-connections/ will automatically and transparently be migrated to Netplan’s declarative YAML format and stored in its common configuration directory /etc/netplan/. 

    The same migration will happen in the background whenever you add or modify any connection profile through the NetworkManager user interface, integrated with GNOME Shell. From this point on, Netplan will be aware of your entire network configuration and you can query it using its CLI tools, such as “sudo netplan get” or “sudo netplan status” without interrupting traditional NetworkManager workflows (UI, nmcli, nmtui, D-Bus APIs). You can observe this migration on the apt-get command line, watching out for logs like the following:

    Setting up network-manager (1.44.2-1ubuntu1.1) ...
    Migrating HomeNet (9d087126-ae71-4992-9e0a-18c5ea92a4ed) to /etc/netplan
    Migrating eduroam (37d643bb-d81d-4186-9402-7b47632c59b1) to /etc/netplan
    Migrating DebConf (f862be9c-fb06-4c0f-862f-c8e210ca4941) to /etc/netplan

    In order to prepare for a smooth transition, NetworkManager tests were integrated into Netplan’s continuous integration pipeline at the upstream GitHub repository. Furthermore, we implemented a passthrough method of handling unknown or new settings that cannot yet be fully covered by Netplan, making Netplan future-proof for any upcoming NetworkManager release.

    The future of Netplan

    Netplan has established itself as the proven network stack across all variants of Ubuntu – Desktop, Server, Cloud, or Embedded. It has been the default stack across many Ubuntu LTS releases, serving millions of users over the years. With the bidirectional integration between NetworkManager and Netplan the final piece of the puzzle is implemented to consider Netplan the “single source of truth” for network configuration on Ubuntu. With Debian choosing Netplan to be the default network stack for their cloud images, it is also gaining traction outside the Ubuntu ecosystem and growing into the wider open source community.

    Within the development cycle for Ubuntu 24.04 LTS, we will polish the Netplan codebase to be ready for a 1.0 release, coming with certain guarantees on API and ABI stability, so that other distributions and 3rd party integrations can rely on Netplan’s interfaces. First steps into that direction have already been taken, as the Netplan team reached out to the Debian community at DebConf 2023 in Kochi/India to evaluate possible synergies.

    Conclusion

    Netplan can be used transparently to control a workstation’s network configuration and plays hand-in-hand with many desktop environments through its tight integration with NetworkManager. It allows for easy network monitoring, using common graphical interfaces and provides a “single source of truth” to network administrators, allowing for configuration of Ubuntu Desktop fleets in a streamlined and declarative way. You can try this new functionality hands-on by following the “Access Desktop NetworkManager settings through Netplan” tutorial.


    If you want to learn more, feel free to follow our activities on Netplan.io, GitHub, Launchpad, IRC or our Netplan Developer Diaries blog on discourse.


    death and gravity: reader 3.10 released – storage internal API

    Planet Python - Sun, 2023-11-12 08:42

    Hi there!

    I'm happy to announce version 3.10 of reader, a Python feed reader library.

    What's new? #

    Here are the highlights since reader 3.9.

    Storage internal API #

    The storage internal API is now documented!

    This is important because it opens up reader to using other databases than SQLite.

    The protocols are mostly stable, but some changes are still expected. The long term goal is full stabilization, but at least one other implementation needs to exist before that, to work out any remaining kinks.

    A SQLAlchemy backend would be especially useful, since it would provide access to a variety of database engines mostly out of the box. (Alas, I have neither the time nor a need for this at the moment. Interested in working on it? Let me know!)

    Why not use SQLAlchemy from the start? #

    In the beginning:

    • I wanted to keep things as simple as possible, so I stay motivated for the long term. I also wanted to follow a problem-solution approach, which cautions against solving problems you don't have. (Details on both here and here.)
    • By that time, I was already a SQLite fan, and due to the single-user nature of reader, I was relatively confident concurrency won't be an issue.
    • I didn't know exactly where and how I would deploy the web app; sqlite3 being in the standard library made it very appealing.

    Since then, I did come up with some of my own complexity – reader has a query builder and a migration system (albeit both of them tiny), and there were some concurrency issues. SQLAlchemy would have likely helped with the first two, but not with the last. Overall, I still think plain SQLite was the right choice at the time.
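For illustration only (this is not reader's actual code), a tiny migration system in the spirit described can be built on SQLite's user_version pragma:

```python
import sqlite3

# illustrative schema migrations: (target version, SQL to get there)
MIGRATIONS = [
    (1, "CREATE TABLE feeds (url TEXT PRIMARY KEY)"),
    (2, "ALTER TABLE feeds ADD COLUMN title TEXT"),
]

def migrate(db):
    # apply only the migrations newer than the stored schema version
    (version,) = db.execute("PRAGMA user_version").fetchone()
    for target, sql in MIGRATIONS:
        if target > version:
            db.execute(sql)
            db.execute(f"PRAGMA user_version = {target}")

db = sqlite3.connect(":memory:")
migrate(db)   # runs both migrations
migrate(db)   # no-op: already at version 2
```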

    Deprecated sqlite3 datetime support #

    The default sqlite3 datetime adapters/converters were deprecated in Python 3.12. Since adapters/converters apply to all database connections, reader does not have the option of registering its own (as a library, it should not change global stuff), so datetime conversions now happen in the storage. As an upside, this provided an opportunity to change the storage to use timezone-aware datetimes.
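
    Doing the conversion at the storage boundary instead of via global adapters might look something like this (a sketch, not reader's actual code): normalize to UTC on write, and get a timezone-aware datetime back on read.

    ```python
    import sqlite3
    from datetime import datetime, timezone

    def adapt_datetime(dt: datetime) -> str:
        # Normalize to UTC and store as ISO 8601 text; no global
        # sqlite3.register_adapter() call, so other connections are unaffected.
        return dt.astimezone(timezone.utc).isoformat(" ")

    def convert_datetime(value: str) -> datetime:
        # fromisoformat() preserves the "+00:00" offset, so the result is aware.
        return datetime.fromisoformat(value)

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE entries (updated TEXT)")
    published = datetime(2023, 11, 12, 8, 42, tzinfo=timezone.utc)
    conn.execute("INSERT INTO entries VALUES (?)", (adapt_datetime(published),))
    (stored,) = conn.execute("SELECT updated FROM entries").fetchone()
    ```

    The round trip is lossless, and everything that leaves the storage layer is timezone-aware.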

    Share experimental plugin #

    There's a new share web app plugin to add social sharing links to the entry page.

    Ideally, this functionality should end up in a plugin that adds them to Entry.links (to be exposed in #320), so all reader users can benefit from it.

    Python versions #

    None this time, but Python 3.12 support is coming soon!

    For more details, see the full changelog.

    That's it for now.

    Want to contribute? Check out the docs and the roadmap.

    Learned something new today? Share this with others, it really helps!

    What is reader? #

    reader takes care of the core functionality required by a feed reader, so you can focus on what makes yours different.

    reader allows you to:

    • retrieve, store, and manage Atom, RSS, and JSON feeds
    • mark articles as read or important
    • add arbitrary tags/metadata to feeds and articles
    • filter feeds and articles
    • full-text search articles
    • get statistics on feed and user activity
    • write plugins to extend its functionality

    ...all these with:

    • a stable, clearly documented API
    • excellent test coverage
    • fully typed Python

    To find out more, check out the GitHub repo and the docs, or give the tutorial a try.

    Why use a feed reader library? #

    Have you been unhappy with existing feed readers and wanted to make your own, but:

    • never knew where to start?
    • it seemed like too much work?
    • you don't like writing backend code?

    Are you already working with feedparser, but:

    • want an easier way to store, filter, sort and search feeds and entries?
    • want to get back type-annotated objects instead of dicts?
    • want to restrict or deny file-system access?
    • want to change the way feeds are retrieved by using Requests?
    • want to also support JSON Feed?
    • want to support custom information sources?

    ... while still supporting all the feed types feedparser does?

    If you answered yes to any of the above, reader can help.

    The reader philosophy #
    • reader is a library
    • reader is for the long term
    • reader is extensible
    • reader is stable (within reason)
    • reader is simple to use; API matters
    • reader features work well together
    • reader is tested
    • reader is documented
    • reader has minimal dependencies
    Why make your own feed reader? #

    So you can:

    • have full control over your data
    • control what features it has or doesn't have
    • decide how much you pay for it
    • make sure it doesn't get closed while you're still using it
    • really, it's easier than you think

    Obviously, this may not be your cup of tea, but if it is, reader can help.

    Categories: FLOSS Project Planets

    Lisandro Damián Nicanor Pérez Meyer: Mini DebConf 2023 in Montevideo, Uruguay

    Planet Debian - Sun, 2023-11-12 06:41

    15 years, "la niña bonita" if you ask many of my fellow Argentinians, is how long I had been absent from any Debian-related face-to-face activity. It was about time to fix that. Thanks to Santiago Ruano Rincón and Gunnar Wolf, who prodded me to come, I finally attended the Mini DebConf Uruguay in Montevideo.

    I took the opportunity to make my first trip by ferry, which is currently one of the best options to get from Buenos Aires to Montevideo, in my case via Colonia. Living some 700 km south-west of Buenos Aires, the trip was long: a 10-hour bus ride, a ferry, and yet another bus... but of course, it was worth it.

    In Buenos Aires' port I met Emmanuel eamanu Arias, a fellow Argentinian Debian Developer from La Rioja, so I had the pleasure to travel with him.

    To be honest, Gunnar already wrote a wonderful blog post with many pictures; I should have taken more myself.

    I had the opportunity to talk about device trees, and even to look at Gunnar's machine in order to find out why a DisplayPort port worked on one kernel but not on another. I also had time to start packaging qt6-grpc. Sadly I was there for just one full day, as I arrived on Thursday afternoon and had to leave on Saturday after lunch, but we did have a lot of quality Debian time.

    I'll repeat here what Gunnar already wrote:

    We had a long, important conversation about an important discussion that we are about to present on debian-vote@lists.debian.org.

    Stay tuned for that; I think this is something we should all get involved in.

    All in all, I already miss hacking with people in the same room. For us, meetings mean a lot of travel (well, I live far away from almost everything), but I really should try to do this more often. Certainly more than just once every 15 years :-)

    Categories: FLOSS Project Planets

    Petter Reinholdtsen: New and improved sqlcipher in Debian for accessing Signal database

    Planet Debian - Sun, 2023-11-12 06:00

    For a while now I have wanted direct access to the Signal database of messages and channels in my desktop edition of Signal. These days I prefer Signal's enforced end-to-end encryption for communication with friends and family, both to increase safety and privacy and to raise the cost of the mass surveillance that government and non-government entities practice. In August I came across a nice recipe explaining how to use sqlcipher to extract statistics from the Signal database. Unfortunately it did not work with the version of sqlcipher in Debian. The sqlcipher package is a "fork" of the sqlite package with added support for encrypted databases. Sadly, the current Debian maintainer announced more than three years ago that he did not have time to maintain sqlcipher, so it seemed unlikely to be upgraded by him. I was reluctant to take on the job myself, as I have very limited experience maintaining shared libraries in Debian. After waiting and hoping for a few months, I gave up last week and set out to update the package. In the process I orphaned it, to make it more obvious to the next person looking at it that the package needs proper maintenance.

    The version in Debian was around five years old, and quite a lot of changes had taken place upstream since they entered the Debian maintenance git repository. After spending a few days importing the new upstream versions, I realised that upstream did not care much for SONAME versioning: library symbols were both added and removed across minor version changes. I concluded that I had to do a SONAME bump of the library package to avoid surprising the reverse dependencies. I also added a simple autopkgtest script to ensure the package works as intended. Deep in the hole of learning shared-library maintenance, I set out a few days ago to upload the new version to Debian experimental, to see what Debian's quality assurance framework had to say about the result. The feedback told me the package was not too shabby, and yesterday I uploaded the latest version to Debian unstable. It should enter testing today or tomorrow, perhaps delayed by a small library transition.
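
    The symbol churn that motivates a SONAME bump can be checked mechanically: compare the exported symbols of the old and new builds, and any removal means existing reverse dependencies can break. A sketch of that comparison (the symbol names below are made up for illustration; real input would come from something like objdump -T or dpkg-gensymbols):

    ```python
    def symbol_diff(old: set[str], new: set[str]) -> tuple[set[str], set[str]]:
        """Return (removed, added) exported symbols between two library builds.

        Removed symbols break existing reverse dependencies, which is what
        calls for a SONAME bump; additions only need a symbols-file update.
        """
        return old - new, new - old

    # Hypothetical symbol sets for two versions of the shared library:
    removed, added = symbol_diff(
        {"lib_open", "lib_key", "lib_rekey"},
        {"lib_open", "lib_key_v2", "lib_rekey"},
    )
    ```

    If `removed` is non-empty across a minor version change, upstream is not keeping the ABI stable and the packaging has to compensate.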

    Armed with a new version of sqlcipher, I can now have a look at the SQL database in ~/.config/Signal/sql/db.sqlite. First, one needs to fetch the encryption key from the Signal configuration using this simple JSON extraction command:

    /usr/bin/jq -r '."key"' ~/.config/Signal/config.json

    The result of that command is a hexadecimal number representing the key used to encrypt the database; assume below that it is 'secretkey'. Next, one can connect to the database, inject the encryption key, and fetch information via SQL. Here is an example dumping the database structure:

    % sqlcipher ~/.config/Signal/sql/db.sqlite
    sqlite> PRAGMA key = "x'secretkey'";
    sqlite> .schema
    CREATE TABLE sqlite_stat1(tbl,idx,stat);
    CREATE TABLE conversations( id STRING PRIMARY KEY ASC, json TEXT, active_at INTEGER, type STRING, members TEXT, name TEXT, profileName TEXT , profileFamilyName TEXT, profileFullName TEXT, e164 TEXT, serviceId TEXT, groupId TEXT, profileLastFetchedAt INTEGER);
    CREATE TABLE identityKeys( id STRING PRIMARY KEY ASC, json TEXT );
    CREATE TABLE items( id STRING PRIMARY KEY ASC, json TEXT );
    CREATE TABLE sessions( id TEXT PRIMARY KEY, conversationId TEXT, json TEXT , ourServiceId STRING, serviceId STRING);
    CREATE TABLE attachment_downloads( id STRING primary key, timestamp INTEGER, pending INTEGER, json TEXT );
    CREATE TABLE sticker_packs( id TEXT PRIMARY KEY, key TEXT NOT NULL, author STRING, coverStickerId INTEGER, createdAt INTEGER, downloadAttempts INTEGER, installedAt INTEGER, lastUsed INTEGER, status STRING, stickerCount INTEGER, title STRING , attemptedStatus STRING, position INTEGER DEFAULT 0 NOT NULL, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync INTEGER DEFAULT 0 NOT NULL);
    CREATE TABLE stickers( id INTEGER NOT NULL, packId TEXT NOT NULL, emoji STRING, height INTEGER, isCoverOnly INTEGER, lastUsed INTEGER, path STRING, width INTEGER, PRIMARY KEY (id, packId), CONSTRAINT stickers_fk FOREIGN KEY (packId) REFERENCES sticker_packs(id) ON DELETE CASCADE );
    CREATE TABLE sticker_references( messageId STRING, packId TEXT, CONSTRAINT sticker_references_fk FOREIGN KEY(packId) REFERENCES sticker_packs(id) ON DELETE CASCADE );
    CREATE TABLE emojis( shortName TEXT PRIMARY KEY, lastUsage INTEGER );
    CREATE TABLE messages( rowid INTEGER PRIMARY KEY ASC, id STRING UNIQUE, json TEXT, readStatus INTEGER, expires_at INTEGER, sent_at INTEGER, schemaVersion INTEGER, conversationId STRING, received_at INTEGER, source STRING, hasAttachments INTEGER, hasFileAttachments INTEGER, hasVisualMediaAttachments INTEGER, expireTimer INTEGER, expirationStartTimestamp INTEGER, type STRING, body TEXT, messageTimer INTEGER, messageTimerStart INTEGER, messageTimerExpiresAt INTEGER, isErased INTEGER, isViewOnce INTEGER, sourceServiceId TEXT, serverGuid STRING NULL, sourceDevice INTEGER, storyId STRING, isStory INTEGER GENERATED ALWAYS AS (type IS 'story'), isChangeCreatedByUs INTEGER NOT NULL DEFAULT 0, isTimerChangeFromSync INTEGER GENERATED ALWAYS AS ( json_extract(json, '$.expirationTimerUpdate.fromSync') IS 1 ), seenStatus NUMBER default 0, storyDistributionListId STRING, expiresAt INT GENERATED ALWAYS AS (ifnull( expirationStartTimestamp + (expireTimer * 1000), 9007199254740991 )), shouldAffectActivity INTEGER GENERATED ALWAYS AS ( type IS NULL OR type NOT IN ( 'change-number-notification', 'contact-removed-notification', 'conversation-merge', 'group-v1-migration', 'keychange', 'message-history-unsynced', 'profile-change', 'story', 'universal-timer-notification', 'verified-change' ) ), shouldAffectPreview INTEGER GENERATED ALWAYS AS ( type IS NULL OR type NOT IN ( 'change-number-notification', 'contact-removed-notification', 'conversation-merge', 'group-v1-migration', 'keychange', 'message-history-unsynced', 'profile-change', 'story', 'universal-timer-notification', 'verified-change' ) ), isUserInitiatedMessage INTEGER GENERATED ALWAYS AS ( type IS NULL OR type NOT IN ( 'change-number-notification', 'contact-removed-notification', 'conversation-merge', 'group-v1-migration', 'group-v2-change', 'keychange', 'message-history-unsynced', 'profile-change', 'story', 'universal-timer-notification', 'verified-change' ) ), mentionsMe INTEGER NOT NULL DEFAULT 0, isGroupLeaveEvent INTEGER GENERATED ALWAYS AS ( type IS 'group-v2-change' AND json_array_length(json_extract(json, '$.groupV2Change.details')) IS 1 AND json_extract(json, '$.groupV2Change.details[0].type') IS 'member-remove' AND json_extract(json, '$.groupV2Change.from') IS NOT NULL AND json_extract(json, '$.groupV2Change.from') IS json_extract(json, '$.groupV2Change.details[0].aci') ), isGroupLeaveEventFromOther INTEGER GENERATED ALWAYS AS ( isGroupLeaveEvent IS 1 AND isChangeCreatedByUs IS 0 ), callId TEXT GENERATED ALWAYS AS ( json_extract(json, '$.callId') ));
    CREATE TABLE sqlite_stat4(tbl,idx,neq,nlt,ndlt,sample);
    CREATE TABLE jobs( id TEXT PRIMARY KEY, queueType TEXT STRING NOT NULL, timestamp INTEGER NOT NULL, data STRING TEXT );
    CREATE TABLE reactions( conversationId STRING, emoji STRING, fromId STRING, messageReceivedAt INTEGER, targetAuthorAci STRING, targetTimestamp INTEGER, unread INTEGER , messageId STRING);
    CREATE TABLE senderKeys( id TEXT PRIMARY KEY NOT NULL, senderId TEXT NOT NULL, distributionId TEXT NOT NULL, data BLOB NOT NULL, lastUpdatedDate NUMBER NOT NULL );
    CREATE TABLE unprocessed( id STRING PRIMARY KEY ASC, timestamp INTEGER, version INTEGER, attempts INTEGER, envelope TEXT, decrypted TEXT, source TEXT, serverTimestamp INTEGER, sourceServiceId STRING , serverGuid STRING NULL, sourceDevice INTEGER, receivedAtCounter INTEGER, urgent INTEGER, story INTEGER);
    CREATE TABLE sendLogPayloads( id INTEGER PRIMARY KEY ASC, timestamp INTEGER NOT NULL, contentHint INTEGER NOT NULL, proto BLOB NOT NULL , urgent INTEGER, hasPniSignatureMessage INTEGER DEFAULT 0 NOT NULL);
    CREATE TABLE sendLogRecipients( payloadId INTEGER NOT NULL, recipientServiceId STRING NOT NULL, deviceId INTEGER NOT NULL, PRIMARY KEY (payloadId, recipientServiceId, deviceId), CONSTRAINT sendLogRecipientsForeignKey FOREIGN KEY (payloadId) REFERENCES sendLogPayloads(id) ON DELETE CASCADE );
    CREATE TABLE sendLogMessageIds( payloadId INTEGER NOT NULL, messageId STRING NOT NULL, PRIMARY KEY (payloadId, messageId), CONSTRAINT sendLogMessageIdsForeignKey FOREIGN KEY (payloadId) REFERENCES sendLogPayloads(id) ON DELETE CASCADE );
    CREATE TABLE preKeys( id STRING PRIMARY KEY ASC, json TEXT , ourServiceId NUMBER GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
    CREATE TABLE signedPreKeys( id STRING PRIMARY KEY ASC, json TEXT , ourServiceId NUMBER GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
    CREATE TABLE badges( id TEXT PRIMARY KEY, category TEXT NOT NULL, name TEXT NOT NULL, descriptionTemplate TEXT NOT NULL );
    CREATE TABLE badgeImageFiles( badgeId TEXT REFERENCES badges(id) ON DELETE CASCADE ON UPDATE CASCADE, 'order' INTEGER NOT NULL, url TEXT NOT NULL, localPath TEXT, theme TEXT NOT NULL );
    CREATE TABLE storyReads ( authorId STRING NOT NULL, conversationId STRING NOT NULL, storyId STRING NOT NULL, storyReadDate NUMBER NOT NULL, PRIMARY KEY (authorId, storyId) );
    CREATE TABLE storyDistributions( id STRING PRIMARY KEY NOT NULL, name TEXT, senderKeyInfoJson STRING , deletedAtTimestamp INTEGER, allowsReplies INTEGER, isBlockList INTEGER, storageID STRING, storageVersion INTEGER, storageUnknownFields BLOB, storageNeedsSync INTEGER);
    CREATE TABLE storyDistributionMembers( listId STRING NOT NULL REFERENCES storyDistributions(id) ON DELETE CASCADE ON UPDATE CASCADE, serviceId STRING NOT NULL, PRIMARY KEY (listId, serviceId) );
    CREATE TABLE uninstalled_sticker_packs ( id STRING NOT NULL PRIMARY KEY, uninstalledAt NUMBER NOT NULL, storageID STRING, storageVersion NUMBER, storageUnknownFields BLOB, storageNeedsSync INTEGER NOT NULL );
    CREATE TABLE groupCallRingCancellations( ringId INTEGER PRIMARY KEY, createdAt INTEGER NOT NULL );
    CREATE TABLE IF NOT EXISTS 'messages_fts_data'(id INTEGER PRIMARY KEY, block BLOB);
    CREATE TABLE IF NOT EXISTS 'messages_fts_idx'(segid, term, pgno, PRIMARY KEY(segid, term)) WITHOUT ROWID;
    CREATE TABLE IF NOT EXISTS 'messages_fts_content'(id INTEGER PRIMARY KEY, c0);
    CREATE TABLE IF NOT EXISTS 'messages_fts_docsize'(id INTEGER PRIMARY KEY, sz BLOB);
    CREATE TABLE IF NOT EXISTS 'messages_fts_config'(k PRIMARY KEY, v) WITHOUT ROWID;
    CREATE TABLE edited_messages( messageId STRING REFERENCES messages(id) ON DELETE CASCADE, sentAt INTEGER, readStatus INTEGER , conversationId STRING);
    CREATE TABLE mentions ( messageId REFERENCES messages(id) ON DELETE CASCADE, mentionAci STRING, start INTEGER, length INTEGER );
    CREATE TABLE kyberPreKeys( id STRING PRIMARY KEY NOT NULL, json TEXT NOT NULL, ourServiceId NUMBER GENERATED ALWAYS AS (json_extract(json, '$.ourServiceId')));
    CREATE TABLE callsHistory (
      callId TEXT PRIMARY KEY,
      peerId TEXT NOT NULL, -- conversation id (legacy) | uuid | groupId | roomId
      ringerId TEXT DEFAULT NULL, -- ringer uuid
      mode TEXT NOT NULL, -- enum "Direct" | "Group"
      type TEXT NOT NULL, -- enum "Audio" | "Video" | "Group"
      direction TEXT NOT NULL, -- enum "Incoming" | "Outgoing
      -- Direct: enum "Pending" | "Missed" | "Accepted" | "Deleted"
      -- Group: enum "GenericGroupCall" | "OutgoingRing" | "Ringing" | "Joined" | "Missed" | "Declined" | "Accepted" | "Deleted"
      status TEXT NOT NULL,
      timestamp INTEGER NOT NULL,
      UNIQUE (callId, peerId) ON CONFLICT FAIL
    );
    [ dropped all indexes to save space in this blog post ]
    CREATE TRIGGER messages_on_view_once_update AFTER UPDATE ON messages WHEN new.body IS NOT NULL AND new.isViewOnce = 1 BEGIN DELETE FROM messages_fts WHERE rowid = old.rowid; END;
    CREATE TRIGGER messages_on_insert AFTER INSERT ON messages WHEN new.isViewOnce IS NOT 1 AND new.storyId IS NULL BEGIN INSERT INTO messages_fts (rowid, body) VALUES (new.rowid, new.body); END;
    CREATE TRIGGER messages_on_delete AFTER DELETE ON messages BEGIN DELETE FROM messages_fts WHERE rowid = old.rowid; DELETE FROM sendLogPayloads WHERE id IN ( SELECT payloadId FROM sendLogMessageIds WHERE messageId = old.id ); DELETE FROM reactions WHERE rowid IN ( SELECT rowid FROM reactions WHERE messageId = old.id ); DELETE FROM storyReads WHERE storyId = old.storyId; END;
    CREATE VIRTUAL TABLE messages_fts USING fts5( body, tokenize = 'signal_tokenizer' );
    CREATE TRIGGER messages_on_update AFTER UPDATE ON messages WHEN (new.body IS NULL OR old.body IS NOT new.body) AND new.isViewOnce IS NOT 1 AND new.storyId IS NULL BEGIN DELETE FROM messages_fts WHERE rowid = old.rowid; INSERT INTO messages_fts (rowid, body) VALUES (new.rowid, new.body); END;
    CREATE TRIGGER messages_on_insert_insert_mentions AFTER INSERT ON messages BEGIN INSERT INTO mentions (messageId, mentionAci, start, length) SELECT messages.id, bodyRanges.value ->> 'mentionAci' as mentionAci, bodyRanges.value ->> 'start' as start, bodyRanges.value ->> 'length' as length FROM messages, json_each(messages.json ->> 'bodyRanges') as bodyRanges WHERE bodyRanges.value ->> 'mentionAci' IS NOT NULL AND messages.id = new.id; END;
    CREATE TRIGGER messages_on_update_update_mentions AFTER UPDATE ON messages BEGIN DELETE FROM mentions WHERE messageId = new.id; INSERT INTO mentions (messageId, mentionAci, start, length) SELECT messages.id, bodyRanges.value ->> 'mentionAci' as mentionAci, bodyRanges.value ->> 'start' as start, bodyRanges.value ->> 'length' as length FROM messages, json_each(messages.json ->> 'bodyRanges') as bodyRanges WHERE bodyRanges.value ->> 'mentionAci' IS NOT NULL AND messages.id = new.id; END;
    sqlite>

    Finally I have the tool needed to inspect and process Signal messages that I need, without using the vendor provided client. Now on to transforming it to a more useful format.
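
    A small wrapper is one way to start that transformation; a sketch (my own, not from the recipe) that combines the key extraction and the PRAGMA step above, assuming the paths shown earlier and a sqlcipher binary on the PATH:

    ```python
    import json
    import subprocess
    from pathlib import Path

    CONFIG = Path.home() / ".config/Signal/config.json"
    DATABASE = Path.home() / ".config/Signal/sql/db.sqlite"

    def build_script(key: str, sql: str) -> str:
        """Build the sqlcipher input: unlock the database, then run the query."""
        return f"PRAGMA key = \"x'{key}'\";\n{sql}\n"

    def signal_query(sql: str) -> str:
        """Run a query against the encrypted Signal database via the sqlcipher CLI."""
        key = json.loads(CONFIG.read_text())["key"]
        result = subprocess.run(
            ["sqlcipher", str(DATABASE)],
            input=build_script(key, sql),
            capture_output=True,
            text=True,
            check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        print(signal_query("SELECT type, COUNT(*) FROM messages GROUP BY type;"))
    ```

    From here the output can be parsed into whatever format is actually useful.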

    As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

    Categories: FLOSS Project Planets

    Daniel Roy Greenfeld: TIL: Autoloading on Filechange using Watchdog

    Planet Python - Sun, 2023-11-12 05:30

    Using Watchdog to monitor a directory for changes so we can alter what we serve out over HTTP. Each segment below is a step in the evolution towards getting it to work.

    Serving HTTP from a directory

    I've done this for ages:

    python -m http.server 8000 -d /path/to/files

    Using http.server in a function

    The Python docs aren't very clear on this, and rather than think hard about it I did this fun hack:

    # cli.py
    from pathlib import Path
    from subprocess import check_call

    def server(site: Path) -> None:
        check_call(['python', '-m', 'http.server', '8000', '-d', site])

    Autoreloading on filechanges

    A bit more involved, and can certainly be improved. Here goes:

    # server.py
    import functools
    import http.server
    import os
    import socketserver
    from pathlib import Path

    from watchdog.events import FileSystemEventHandler
    from watchdog.observers import Observer

    def build_handler(directory: Path):
        """Specify the directory of SimpleHTTPRequestHandler"""
        return functools.partial(http.server.SimpleHTTPRequestHandler, directory=directory)

    class EventHandler(FileSystemEventHandler):
        def on_any_event(self, event):
            if event.is_directory:
                return
            elif event.event_type in ["created", "modified"]:
                print(f"Reloading server due to file change: {event.src_path}")
                os._exit(0)

    def run_server(directory: Path, port: int = 8000):
        with socketserver.TCPServer(("", port), build_handler(directory)) as httpd:
            print(f"Serving on port {port}")
            httpd.serve_forever()

    def server(directory: Path, port: int = 8000):
        """Serve files in the watched directory"""
        # Watch the directory (note: `directory`, not the undefined `site`)
        event_handler = EventHandler()
        observer = Observer()
        observer.schedule(event_handler, directory, recursive=True)
        observer.start()
        try:
            # Run the HTTP server
            run_server(directory=directory, port=port)
        except KeyboardInterrupt:
            observer.stop()
            observer.join()

    Usage:

    # cli.py
    from pathlib import Path

    from server import server as serve_directory

    def server(site: Path = Path("/path/to/directory/of/html"), port: int = 7500) -> None:
        serve_directory(directory=site, port=port)
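
    Since on_any_event terminates the whole process with os._exit(0), something outside the server has to restart it for the "reload" to actually happen. A minimal supervisor might look like this (a sketch of my own, assuming the server.py above sits in the working directory):

    ```python
    # supervise.py -- re-run a command every time it exits
    import subprocess
    import sys
    from typing import Optional

    def supervise(cmd: list[str], max_restarts: Optional[int] = None) -> int:
        """Run cmd, restarting it each time it exits; returns the run count.

        max_restarts=None loops forever, which is what a dev server wants;
        a finite value is handy for testing the loop itself.
        """
        runs = 0
        while max_restarts is None or runs < max_restarts:
            subprocess.run(cmd)
            runs += 1
        return runs

    if __name__ == "__main__":
        # Each file change makes server.py exit; we start it again.
        supervise([sys.executable, "server.py"])
    ```

    The same effect can be had with a shell while-loop, but keeping it in Python matches the rest of the TIL.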
    Categories: FLOSS Project Planets
