FLOSS Project Planets

Ivana Isadora Devcic - Why Your Community Needs a Developer Portal - Akademy 2019

Planet KDE - Mon, 2024-04-15 17:24

How can a community like KDE benefit from a developer portal...and what is a developer portal, anyway? This talk aims to answer those questions, and offers practical advice for building a developer portal. The insights from this session can serve as guidance and inspiration to all contributors who want to make sure their community keeps growing and thriving.

Categories: FLOSS Project Planets

The Drop Times: Embracing Technology with Drupal

Planet Drupal - Mon, 2024-04-15 16:24

In 2024, technology has reached new heights, impacting our lives like never before. With AI becoming more common and exciting updates on the horizon, technology is shaping our world significantly.

Technology continues to redefine the way we live, work, and connect. New features, updates, and innovations emerge daily, shaping our world remarkably. At the heart of this transformation lies Drupal, the dynamic content management system that empowers individuals and businesses to navigate the digital realm easily and confidently. Drupal opens limitless possibilities in website development, offering a platform where creativity knows no bounds.

Drupal streamlines the process of creating and managing websites, putting the power of customization and control at our fingertips. With its intuitive interface and robust features, Drupal allows users to craft online experiences that are not just functional but truly exceptional.

Technology is advancing rapidly, and Drupal is here to make it accessible and beneficial for everyone. With its user-friendly features and adaptability, Drupal simplifies website creation and opens doors to endless possibilities.

Let's review the latest news covered by The Drop Times (TDT) last week.

In an interview with Alka Elizabeth, sub-editor at TDT, Irina Zaks, Co-Founder and CTO at Fibonacci Web Studio, discusses her transition from a background in physics to web development, emphasizing the importance of clear, simple solutions in complex projects. She highlights the Fibonacci sequence's role in her design philosophy, advocating for harmony and simplicity in web development. Zaks also reflects on the unique challenges of academic websites and her efforts to promote open-source software within academia, particularly through her work at the Stanford Open Source Lab. The interview is a precursor to Stanford Web Camp.

Head to the concluding part of our series, "Drupal Page Builders—Part 4: Distributions," crafted by André Angelantoni, Senior Drupal Architect, HeroDevs. The article discusses the practicalities and considerations of using Drupal distributions for page building. It emphasizes that while distributions can simplify the setup process, especially for non-technical users, they often involve complex maintenance and may slow down upgrades due to dependency on the distribution maintainers. The piece advises conducting a proof-of-concept for the most complex page layouts to ensure the chosen solution can handle the requirements, highlighting that many teams opt for minimal module use to maintain flexibility and control.

Explore the articles curated by Kazima Abbas on the featured speakers who presented sessions at EvolveDrupal Atlanta. The first part features Michael Herchel, Mary Blabaum, Allison Vorthmann, Andy Waldrop, and Jesse Dyck. Topics range from an inclusive design approach and the impact of Drupal 7's end of life on website security to leveraging Drupal for scalable software products and insights on migrating to Drupal 10. The second part shifts to topics including The Fourth Decade of Website Deployments, the Crossroads of Website Evolution, and the updates from WCAG 2.1 to 2.2. John Cloys, Penny Kronz, and Steve Persch are featured in this segment.

The upcoming Drupal Developer Days in Burgas, Bulgaria, from June 26-28, 2024, at Burgas Free University, has announced Suzanne Dergacheva and Frederik W. as keynote speakers. DrupalCamp Iberia 2024 has officially announced its schedule for the upcoming gathering on May 10 and 11 at PACT in Évora, marking a significant event in the Drupal community's annual calendar. The Network of European Drupal Associations (NEDA) has scheduled a meeting on April 24, 2024, led by Esmeralda Tijhoff. This meeting aims to foster connections among Drupal Associations globally, sharing experiences and discussing support strategies.

DrupalCon Portland 2024 extends a special offer to students, recent graduates, and individuals with Drupal Training Certification obtained since 2022, providing an opportunity to purchase tickets for the event at a discounted rate of $50. The DropTimes has been announced as the official media partner for The LagoonCon Portland 2024. DrupalSouth has made their talks from DrupalSouth 2024 accessible online via their YouTube channel! Now, you can catch up on all the insightful discussions and presentations from the comfort of your home. A particularly engaging story narrated at the event is the talk by Dallas Ramsden, a story of human resilience. The Eclipse Foundation has initiated a collaborative effort to establish common cybersecurity standards in response to the European Union's upcoming Cyber Resilience Act. This initiative involves multiple open-source organizations and aims to harmonize secure software development practices across the industry, addressing both regulatory requirements and the broader challenge of securing open-source software. The Greek Drupal Community celebrated a resurgence with the Greece Spring Sprint 2024, where experienced developers across Athens, Thessaloniki, and Patras collaborated to address numerous issues. We have received pictures of the event via Georgios Andreadis; take a look here.

Jesus Manuel Olivas, Co-Founder and CEO at Octahedroid, announced the initial release of two significant Drupal modules—Visual Editor and Decoupled Preview Iframe. A recent update from Drupal.org saw Gábor Hojtsy outlining the forthcoming release plans for Drupal 11, noting significant dependency updates and two potential release windows later this year. Obviously, this has created discussions in the community. Paul Johnson, a long-time participant and photographer at Drupal events, has announced an initiative by the Promote Drupal team to create the first official Promote Drupal Image Library. The team calls for submissions from photographers within the Drupal community who are willing to contribute their work under a Creative Commons license. 

While there are undoubtedly more stories to share, our current constraints may necessitate a temporary pause in our coverage.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. Also, join us on Drupal Slack at #thedroptimes.

Thank you,
Sincerely,
Kazima Abbas
Sub-editor, The Drop Times

Categories: FLOSS Project Planets

Ned Batchelder: Try it: function/class coverage report

Planet Python - Mon, 2024-04-15 16:02

I’ve added experimental function and class coverage reports to coverage.py. I’d like feedback about whether they behave the way you want them to.

I haven’t made a PyPI release. To try the new reports, install coverage from GitHub. Be sure to include the hash:

$ python3 -m pip install git+https://github.com/nedbat/coveragepy@f10c455b7c8fd26352de#egg=coverage==0.0

Then run coverage and make an HTML report as you usually do. You should have two new pages, not linked from the index page (yet). “htmlcov/function_index.html” is the function coverage report, and the classes are in “htmlcov/class_index.html”.

I had to decide how to categorize nested functions and classes. Inner functions are not counted as part of their outer functions. Classes consist of the executable lines in their methods, but not lines outside of methods, because those lines run on import. Each file has an entry in the function report for all of the lines outside of any function, called “(no function)”. The class report has “(no class)” entries for lines outside of any classes.

The result should be that every line is part of one function, or the “(no function)” entry, and every line is part of one class, or the “(no class)” entry. This is what made sense to me, but maybe there’s a compelling reason to do it differently.
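
As a rough illustration, consider a small file like the one below (just a sketch, not something from the coverage.py docs); the comments show how its lines would be attributed under the rules above:

# example.py
GREETING = "hello"              # module level: "(no function)" and "(no class)"

def outer():
    x = 1                       # part of "outer"
    def inner():
        return x + 1            # part of "outer.inner", not counted in "outer"
    return inner()

class Greeter:
    default = GREETING          # class body outside methods: runs on import,
                                # so not counted as part of "Greeter"
    def greet(self):
        return self.default     # executable method lines make up "Greeter"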

The reports have a sortable column for the file name, and a sortable column for the function or class. Where functions or classes are nested, the name is a dotted sequence, but is sorted by only the last component. Just like the original file listing page, the new pages can be filtered to focus on areas of interest.

You can look at some sample reports:

It would be helpful if you could give me feedback on the original issue about some questions:

  • Is it useful to have “(no function)” and “(no class)” entries or is it just distracting pedantry? With the entries, the total is the same as the file report, but they don’t seem useful by themselves.
  • Does the handling of nested functions and classes make sense?
  • Should these reports be optional (requested with a switch) or always produced?
  • Is it reasonable to produce one page with every function? How large does a project have to get before that’s not feasible or useful?
  • And most importantly: do these reports help you understand how to improve your code?

This is only in the HTML report for now, but we can do more in the future. Other ideas about improvements are of course welcome. Thanks!

Categories: FLOSS Project Planets

Talking Drupal: Talking Drupal #446 - Test Driven Development

Planet Drupal - Mon, 2024-04-15 14:00

Today we are talking about Test Driven Development, why it's important, and how it improves development, with guest Alexey Korepov. We'll also cover Test Helpers as our module of the week.

For show notes visit: www.talkingDrupal.com/446

Topics
  • What does the term Test Driven Development (TDD) mean
  • Does Drupal make use of TDD
  • What makes TDD different from other methods of Development
  • Do you have to change your way of thinking
  • What are some good resources to learn TDD
  • Do you have any pointers for teams looking to get started
  • Are certain kinds of projects better suited to TDD
  • How have dev teams adapted to TDD
  • Any advice on environment setup
  • Any special tools
Resources

Guests

Alexey Korepov - korepov.pro Murz

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
Martin Anderson-Clutz - mandclu
Matt Glaman - mglaman.dev mglaman

MOTW Correspondent

Martin Anderson-Clutz - mandclu

  • Brief description:
    • Have you ever wanted an API that could dramatically simplify the process of writing Drupal unit tests? There’s a module for that.
  • Module name/project name:
  • Brief history
    • How old: created in Sep 2022 by today’s guest, Alexey Korepov
    • Versions available: 1.3.0 compatible with versions of Drupal 9.4 or newer, right up to Drupal 11
  • Maintainership
    • Actively maintained, latest release less than 3 months ago
    • Security coverage
    • Test coverage, would be ironic if it didn’t
    • API Documentation is available, linked from the project page
    • Number of open issues: 2 open issues, which are actually feature requests
  • Usage stats:
    • 5 sites officially, but modules or sites can leverage Test Helpers without enabling it, and this usage is recommended, so the number is actually higher
  • Module features and usage
    • Provides a new container that automated tests can leverage to perform common tasks with much less code.
    • For example, you can create a user or a node with a single line of code
    • You can also mock more complex operations like an entityQuery or loadMultiple call, again with a single line of code
    • Traditionally, writing unit tests is more complicated because by design they run without fully bootstrapping Drupal
    • That means that your test needs to mock functions or services in the code you’re testing, which can result in unit tests being much longer than the code they’re testing
    • Test Helpers also allows your tests to leverage existing mocks and stubs for popular services
    • The project page also links to the recording and slides for a talk Alexey gave about Test Helpers at DrupalCon Pittsburgh last year, if you want to do a deeper dive
Categories: FLOSS Project Planets

Okular gets interesting new features

Planet KDE - Mon, 2024-04-15 13:31

Next week KDE will release Applications 19.04, overhauling most of the suite of programs usually packaged with KDE's Plasma desktop. This new version makes KDE's applications more user-friendly, consistent with each other and smoother across the board.

Okular gets, among other improvements, support for showing and verifying digital signatures in PDFs. This feature is essential for using KDE's document viewer for business or for exchanging documents in any official capacity... like when negotiating an international peace treaty. Digital signatures allow you to establish the authenticity of a document, verify whether it has been tampered with, and confirm where it has come from and who sent it.

Attributions:

Ye oldie public domain footage and French Reel Victor Military Band Burchenal music available from:
https://archive.org

"Sun Lotion" by Steadman CC By-NC-ND available from:
http://freemusicarchive.org/music/Steadman/The_Bitter_End/08_sun_lotion
(along with many more banging tunes)

Peace treaty footage available from:
Clintonlibrary 42: https://www.youtube.com/channel/UCMkYd9teJEBlCOwDan6qPpg
One News New Zealand: https://www.youtube.com/channel/UCRdbeElPWvSumxERqoR7ZUA
European Commission: https://www.youtube.com/channel/UCMPaviJxybo1RTdzvYcU91A

Many thanks to the content creators that make their work available for free and under generous free licenses. Without your generosity, making videos like this one would not be possible.

Categories: FLOSS Project Planets

The Drop Times: Customer Benefits from Pantheon—Lytics Partnership Explained

Planet Drupal - Mon, 2024-04-15 13:19
Pantheon and Lytics have teamed up to redefine the landscape of website personalization, introducing a solution announced at the Google Cloud Next ‘24 conference. This collaboration offers a transformative approach by leveraging generative AI to provide sophisticated personalization tools, which were previously complex and expensive but are now streamlined and accessible.

By integrating Lytics' Personalization Engine, companies can deploy tailored digital content strategies within 30 days—significantly faster than the traditional months-long process. This partnership simplifies the personalization process and emphasizes using first-party data, aligning with privacy standards, and moving away from reliance on third-party cookies. Embrace this new digital marketing era with Pantheon and Lytics, where advanced technology meets user-centric privacy.
Categories: FLOSS Project Planets

Andreas Rönnquist: Status update for Allegro packaging in Debian

Planet Debian - Mon, 2024-04-15 12:10

I have mailed a Debian bug on allegro4.4 describing my reasoning
regarding the allegro libraries. In short, allegro4.4 is pretty much
dead upstream, and my interest was basically to keep alex4 (which is
cool) in Debian, but since it migrated to non-free, my interest in
allegro4.4 has waned. So, if anybody would still like to see allegro4.4
in Debian, please step up now and help out. Since it is dead upstream,
my reasoning is that it is better to remove it from Debian if no
maintainer who wants to help steps up.

Previously Tobias Hansen has helped out, but it is now 8 (!) years
since his last upload of either package. (Please don’t interpret this as
judgement; I am very happy for the help he has provided and all the
work he has done on the packages.)

Allegro5 is a different story: it is still active upstream, and I have kept it
up to date in Debian. While I have held back the latest upload a short while
because of the time_t transition, it will come sooner or later. There
I am also waiting on a final decision on this bug from upstream. Other than
that, allegro5 is in a very good state, and I will keep maintaining it
as long as I can. But help would of course be appreciated on allegro5
too.

Categories: FLOSS Project Planets

Real Python: Build a Blog Using Django, GraphQL, and Vue

Planet Python - Mon, 2024-04-15 10:00

Are you a regular Django user? Do you find yourself wanting to decouple your back end and front end? Do you want to handle data persistence in the API while you display the data in a single-page app (SPA) in the browser using a JavaScript framework like React or Vue?

If you answered yes to any of these questions, then you’re in luck. This tutorial will take you through the process of building a Django blog back end and a Vue front end, using GraphQL to communicate between them.

Projects are an effective way to learn and solidify concepts. This tutorial is structured as a step-by-step project so you can learn in a hands-on way and take breaks as needed.

In this tutorial, you’ll learn how to:

  • Translate your Django models into a GraphQL API
  • Run the Django server and a Vue application on your computer at the same time
  • Administer your blog posts in the Django admin
  • Consume a GraphQL API in Vue to show data in the browser

You can download all the source code you’ll use to build your Django blog application by clicking the link below:

Get Your Code: Click here to download the free sample code that you’ll use to build a blog using Django, GraphQL, and Vue.

Demo: A Django Blog Admin, a GraphQL API, and a Vue Front End

Blog applications are a common starter project because they involve create, read, update, and delete (CRUD) operations. In this project, you’ll use the Django admin to do the heavy CRUD lifting and you’ll focus on providing a GraphQL API for your blog data.

You’ll use Vue.js 3 and its composition API for the front end of your blog. Vue lets you create dynamic interfaces pretty smoothly, thanks to its reactive data binding and easy-to-manage components. Plus, since you’re dealing with data from a GraphQL API, you can leverage the Vue Apollo plugin.

Here’s a demonstration of the completed project in action:

Next, you’ll make sure you have all the necessary background information and tools before you dive in and build your blog application.

Project Overview

For this project, you’ll create a small blogging application with some rudimentary features:

  • Authors can write many posts.
  • Posts can have many tags and can be either published or unpublished.

You’ll build the back end of this blog in Django, complete with an admin for adding new blog content. Then you’ll expose the content data as a GraphQL API and use Vue to display that data in the browser.
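
As a rough sketch of that model-to-API translation (not the tutorial's actual code, and assuming a hypothetical Post model plus the graphene-django package), exposing posts through a GraphQL query could look roughly like this:

# schema.py - illustrative sketch only
import graphene
from graphene_django import DjangoObjectType

from blog.models import Post  # hypothetical app and model


class PostType(DjangoObjectType):
    class Meta:
        model = Post
        fields = ("title", "body")  # assumed field names


class Query(graphene.ObjectType):
    all_posts = graphene.List(PostType)

    def resolve_all_posts(root, info):
        # Return every post; a real schema might filter to published posts only.
        return Post.objects.all()


schema = graphene.Schema(query=Query)

With a schema along these lines wired into the project, the Vue front end can fetch posts through a single GraphQL endpoint, which is the pattern the tutorial walks through in full.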

You’ll accomplish this in several high-level steps. At the end of each step, you’ll find a link to the source code for that stage of the project.

If you’re curious about how the source code for each step looks, then you can click the link below:

Get Your Code: Click here to download the free sample code that you’ll use to build a blog using Django, GraphQL, and Vue.

Prerequisites

You’ll be best equipped for this tutorial if you already have a solid foundation in some web application concepts. You should understand how HTTP requests and responses and APIs work. You can check out Python & APIs: A Winning Combo for Reading Public Data to understand the details of using GraphQL APIs vs REST APIs.

Because you’ll use Django to build the back end for your blog, you’ll want to be familiar with starting a Django project and customizing the Django admin. If you haven’t used Django much before, you might also want to try building another Django-only project first. For a good introduction, check out Get Started with Django Part 1: Build a Portfolio App.

And because you’ll be using Vue on the front end, some experience with JavaScript will also help. If you’ve only used a JavaScript framework like jQuery in the past, the Vue introduction is a good foundation.

Familiarity with JSON is also important because GraphQL queries are JSON-like and return data in JSON format. You can read about Working with JSON Data in Python for an introduction.

Read the full article at https://realpython.com/python-django-blog/ »

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Kdenlive 24.02.2 released

Planet KDE - Mon, 2024-04-15 09:53

The second maintenance release of the 24.02 series is out, with performance optimizations when moving clips in the timeline and across multiple project bins, packaging improvements to the macOS and Windows versions, and fixes to copy/paste of effects, rotoscoping and Nvidia encoding, among others.

Full changelog

  • Fix blurry folder icon with some project profiles. Commit.
  • Fix timeline not following playhead. Commit.
  • When copy/paste effects from a group, only paste effects for the active clip. Commit. Fixes bug #421667.
  • Optimize group move (don’t update clip position twice). Commit.
  • Fix nvidia encoding. Commit.
  • Multiple improvements for timeline keyboard grab (don’t test each frame on a move, scroll timeline accordingly, don’t lose focus on app switch). Commit.
  • Update to last commit: only sync shortcuts if there was a change. Commit.
  • Fix: editing toolbar config discards newly set keyboard shortcuts. Commit.
  • Increase Qt6 limit for max image size. Commit. See bug #484752.
  • Fix: Ensure secondary bins have a title bar when needed and that the dock widgets list is always correctly sorted. Commit.
  • Don’t perform bin block twice on main bin. Commit.
  • Fix: lag moving clips from one bin to another and unneeded monitor clip reload. Commit.
  • Fix crash and color theme broken on Windows when opening a project by double click. Commit.
  • Try to fix empty monitor when switching to/from fullscreen on Mac. Commit.
  • Fix mem leak on save. Commit.
  • Add more locks around xml producer, fix autosave triggered on project open. Commit.
  • Mediabrowser: ensure thumbnails are generated after changing the view. Commit.
  • Enable video thumbnails in media browser for Win/Mac. Commit.
  • Fix: don’t propose existing name for new sequence. Commit. Fixes bug #472753.
  • Fix crash in sequence clip thumbnails. Commit. See bug #483836.
  • Ensure we never reset the locale while an MLT XML Consumer is running (it caused data corruption). Commit. See bug #483777.
  • Add icon data to shared-mime-info. Commit.
  • Fix: favorite effects menu not refreshed when a new effect is set as favorite. Commit.
  • Fix: Rotoscoping not allowing to add points close to bottom of the screen. Commit.
  • Fix: Rotoscoping – allow closing shape with Return key, don’t discard initial shape when drawing it and seeking in timeline. Commit. See bug #484009.
  • Fix: cannot translate the “P” for Proxy in timeline. Commit. Fixes bug #471850.
  • Fix white background and blank monitor on Windows after going back from fullscreen. Commit. Fixes bug #484081.
  • Fix wrong KDEInstallDirs on Windows. Commit.
  • Fix recent commit not allowing to open files. Commit.
  • Don’t crash opening a corrupted project file with no tracks. Commit.
  • Fix: cannot move compositions properly in timeline with Qt6. Commit. Fixes bug #484062.
  • Proxy clip: highlight proxy in file manager when opening the folder. Commit.

The post Kdenlive 24.02.2 released appeared first on Kdenlive.

Categories: FLOSS Project Planets

KDE neon Open Door Chat

Planet KDE - Mon, 2024-04-15 06:16

We’re a few weeks past the KDE 6 Megarelease, and while many people have it working well, there were too many problems in KDE neon’s rollout.

We’ll be hosting two Open Door Chats on KDE Meet tomorrow (Tue 16 April) where users can talk to the developers about any problems they had.

https://meet.kde.org/b/jon-0yw-xqi-sk6 The access code will be posted in the KDE neon Telegram group and Matrix room, or e-mail jr@jriddell.org for it.

Chats at 09:00 UTC (10:00 BST, 11:00 CEST) and 19:00 UTC (20:00 BST, 21:00 CEST) Tue 16 April 2024

Categories: FLOSS Project Planets

eGenix.com: Python Meeting Düsseldorf - 2024-04-17

Planet Python - Mon, 2024-04-15 04:00

The following announcement concerns a regional Python user group meeting in Düsseldorf, Germany.

Announcement

The next Python Meeting Düsseldorf will take place on the following date:

17.04.2024, 18:00
Room 1, 2nd floor, Bürgerhaus Stadtteilzentrum Bilk
Düsseldorfer Arcaden, Bachstr. 145, 40217 Düsseldorf


Programme

Talks registered so far:
  • Marc-André Lemburg:
    Advanced parsing structured data with Python's new match statement
  • Jens Diemer:
    Connecting Tinkerforge to Home Assistant
  • Charlie Clark:
    A small data analysis
  • Detlef Lannert:
    An overview of CLI frameworks

Further talks are welcome. If interested, please get in touch at info@pyddf.de.

Start time and location

We meet at 18:00 at the Bürgerhaus in the Düsseldorfer Arcaden.

The Bürgerhaus shares its entrance with the swimming pool and is located next to the entrance of the underground car park of the Düsseldorfer Arcaden.

Above the entrance there is a large "Schwimm’ in Bilk" logo. Behind the door, turn immediately left to the two lifts, then go up to the 2nd floor. The entrance to Room 1 is directly on the left as you come out of the lift.

>>> Entrance in Google Street View

⚠️ Important: Please only register if you are absolutely sure that you will attend. Given the limited number of seats, we have no sympathy for last-minute cancellations or no-shows.

Introduction

The Python Meeting Düsseldorf is a regular event in Düsseldorf aimed at Python enthusiasts from the region.

Our PyDDF YouTube channel, on which we publish videos of the talks after each meeting, gives a good overview of past talks.

The meeting is organised by eGenix.com GmbH, Langenfeld, in cooperation with Clark Consulting & Research, Düsseldorf:

Format

The Python Meeting Düsseldorf uses a mix of (lightning) talks and open discussion.

Talks can be registered in advance or brought in spontaneously during the meeting. A projector with HDMI and Full HD resolution is available.

To register a (lightning) talk, simply send an informal e-mail to info@pyddf.de

Cost contribution

The Python Meeting Düsseldorf is organised by Python users for Python users.

Since the meeting room, projector, internet and drinks all cost money, we ask participants for a contribution of EUR 10.00 incl. 19% VAT. Pupils and students pay EUR 5.00 incl. 19% VAT.

We ask all participants to bring the amount in cash.

Registration

Since we can only accommodate 25 people in the rented room, we ask you to register in advance.

Please register for the meeting via Meetup

Further information

Further information can be found on the meeting's website:

              https://pyddf.de/

Have fun!

Marc-Andre Lemburg, eGenix.com

Categories: FLOSS Project Planets

Zato Blog: Service-oriented API task scheduling

Planet Python - Mon, 2024-04-15 04:00
Service-oriented API task scheduling
2024-04-15, by Dariusz Suchojad

An integral part of Zato, its scalable, service-oriented scheduler makes it possible to execute high-level API integration processes as background tasks. The scheduler runs periodic jobs which in turn trigger services, and services are what is used to integrate systems.

Integration process

In this article we will check how to use the scheduler with three kinds of jobs, one-time, interval-based and Cron-style ones.

What we want to achieve is a sample yet fairly common use-case:

  • Periodically consult a remote REST endpoint for new data
  • Store data found in Redis
  • Push data found as an e-mail attachment

Instead of, or in addition to, Redis or e-mail, we could use SQL and SMS, or MongoDB and AMQP or anything else - Redis and e-mail are just example technologies frequently used in data synchronisation processes that we use to highlight the workings of the scheduler.

No matter the input and output channels, the scheduler always works the same: a definition of a job is created and the job's underlying service is invoked according to the schedule. It is then up to the service to perform all the actions required in a given integration process.

Python code

Our integration service will read as below:

# -*- coding: utf-8 -*-

# Zato
from zato.common.api import SMTPMessage
from zato.server.service import Service

class SyncData(Service):
    name = 'api.scheduler.sync'

    def handle(self):

        # Which REST outgoing connection to use
        rest_out_name = 'My Data Source'

        # Which SMTP connection to send an email through
        smtp_out_name = 'My SMTP'

        # Who the recipient of the email will be
        smtp_to = 'hello@example.com'

        # Who to put on CC
        smtp_cc = 'hello.cc@example.com'

        # Now, let's get the new data from a remote endpoint ..

        # .. get a REST connection by name ..
        rest_conn = self.out.plain_http[rest_out_name].conn

        # .. download newest data ..
        data = rest_conn.get(self.cid).text

        # .. construct a new e-mail message ..
        message = SMTPMessage()
        message.subject = 'New data'
        message.body = 'Check attached data'

        # .. add recipients ..
        message.to = smtp_to
        message.cc = smtp_cc

        # .. attach the new data to the message ..
        message.attach('my.data.txt', data)

        # .. get an SMTP connection by name ..
        smtp_conn = self.email.smtp[smtp_out_name].conn

        # .. send the e-mail message with newest data ..
        smtp_conn.send(message)

        # .. and now store the data in Redis.
        self.kvdb.conn.set('newest.data', data)

Now, we just need to make it run periodically in the background.

Mind the timezone

In the next steps, we will use the Zato Dashboard to configure new jobs for the scheduler.

Keep in mind that any date and time you enter in web-admin is always interpreted in your web-admin user's timezone, and this applies to the scheduler too - by default the timezone is UTC. You can change it by clicking Settings and picking the right timezone to make sure that the scheduled jobs run as expected.

It does not matter what timezone your Zato servers are in - they may be in different ones than the user that is configuring the jobs.

Endpoint definitions

First, let's use web-admin to define the endpoints that the service uses. Note that Redis does not need an explicit declaration because it is always available under "self.kvdb" in each service.

  • Configuring outgoing REST APIs

  • Configuring SMTP e-mail

Now, we can move on to the actual scheduler jobs.

Three types of jobs

To cover different integration needs, three types of jobs are available:

  • One-time - fires once only at a specific date and time and then never runs again
  • Interval-based - for periodic processes, can use any combination of weeks, days, hours, minutes and seconds for the interval
  • Cron-style - similar to interval-based but uses the syntax of Cron for its configuration

One-time

Select one-time if the job should not be repeated after it runs once.

Interval-based

Select interval-based if the job should be repeated periodically. Note that such a job will by default run indefinitely, but you can also specify after how many times it should stop, letting you express concepts such as "Execute once per hour but for the next seven days".

Cron-style

Select cron-style if you are already familiar with the syntax of Cron or if you have some Cron tasks that you would like to migrate to Zato.

Running jobs manually

At times, it is convenient to run a job on demand, no matter what its schedule is and regardless of what type a particular job is. Web-admin always lets you execute a job directly. Simply find the job in the listing, click "Execute" and it will run immediately.

Extra context

It is very often useful to provide additional context data to a service that the scheduler runs. To achieve it, simply enter any arbitrary value in the "Extra" field when creating or editing a job in web-admin.

Afterwards, that information will be available as self.request.raw_request in the service's handle method.
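
As a minimal sketch (not part of the article's example service), reading that extra context could look like this:

# -*- coding: utf-8 -*-

# Zato
from zato.server.service import Service

class SyncDataWithExtra(Service):
    """ Illustrative only - shows where a job's "Extra" value surfaces. """
    name = 'api.scheduler.sync.extra'

    def handle(self):

        # Whatever was entered in the job's "Extra" field in web-admin
        extra = self.request.raw_request

        # Log it so it is easy to confirm what the scheduler passed in
        self.logger.info('Extra context received: %s', extra)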

Reusability

There is nothing else required - all is done and the service will run in accordance with a job's schedule.

Yet, before concluding, observe that our integration service is completely reusable - there is nothing scheduler-specific in it despite the fact that we currently run it from the scheduler.

We could now invoke the service from the command line. Or we could mount it on a REST, AMQP or WebSocket channel, or trigger it from any other channel - exactly the same Python code will run in exactly the same fashion, without any new programming effort needed.

More blog posts
Categories: FLOSS Project Planets

GNU Taler news: GNU Taler v0.10 released

GNU Planet! - Sun, 2024-04-14 18:00
We are happy to announce the release of GNU Taler v0.10.
Categories: FLOSS Project Planets

Improvements to QTextDocument

Planet KDE - Sun, 2024-04-14 16:00

Marknote uses QTextDocument for its WYSIWYG text editor. This is surprisingly quite powerful and, thanks to some code borrowed from KMail’s rich text editor, it wasn’t hard to implement a huge part of the Markdown specification.

But while QTextDocument is great, I quickly hit some limits. This is why I started fixing some of them, and I already have some patches up for review in Qt.

Default table style

By default the style of tables looks straight out of the 90s, so I submitted a patch to use something a bit nicer: https://codereview.qt-project.org/c/qt/qtbase/+/554132

Old style

New style

This is easy to change in the app itself, and this is already the case in Marknote and NeoChat table rendering, but by default I believe Qt should provide a nice style so that apps don’t need to figure out how to overwrite the default one.

While working on this, I also noticed that the border-collapse property was not supported by the layout engine in QtQuick, only in QtWidgets. This resulted in table borders being twice as thick as they should be. Fortunately this was fixed by a simple patch: https://codereview.qt-project.org/c/qt/qtdeclarative/+/554151

Responsive images

A width and a height can be assigned to an image in a QTextDocument by using the respective HTML attributes: <img height="500" width="500" />. This works correctly when the window size is known and static, but less so when the window can be resized at will by the user. A common technique in web design is to add the following styles to the website:

img { max-width: 100%; }

And while QTextDocument supports some CSS properties, max-width wasn’t implemented. Adding support for this property in the HTML and CSS parser was quite simple, as it mostly meant forwarding the value of the max-width attribute to the QTextImageHandler. A bit more complicated was the handling of the % unit, as previously only px and em were supported by Qt.

This resulted in two more patches, one for QTextDocument and QtWidgets and one for QtQuick.

Future

Hopefully these patches will be reviewed and merged soon. Afterwards I want to add support for a few more CSS properties, like border-radius for images and some other elements, as well as the border-{width, style, color} properties for paragraphs to better support <blockquote /> (important when displaying emails).

Categories: FLOSS Project Planets

#! code: Drupal 10: Adding Extra User Account Protection

Planet Drupal - Sun, 2024-04-14 14:01

One of Drupal's strengths is its ability to create communities of users who contribute towards the content of the site. Whether you have an open forum, where users can create their own accounts, or a closed magazine site with just a few editors, you need to take the security of your users seriously.

Out of the box, Drupal has a number of account protection features that assist in making sure that users are authenticated correctly.

For example, the user login page is protected by a brute force system and will lock accounts after a number of incorrect password attempts in a short amount of time.

There are a few other things you can do to protect your site users that can be applied to any Drupal site. In this article we'll look through a number of different modules and techniques you can use to protect the user accounts on your site. We'll look at some of the pros and cons of each approach.

Flood Control

Drupal's login forms have built-in brute force protection that will block any user account that fails to enter the correct password more than 5 times per IP address within an hour. This prevents automated bots from just guessing the password of a user account thousands of times until it hits the right combination.

The Flood Control module allows these settings to be tweaked to make them more (or less) restrictive.

Read more

Categories: FLOSS Project Planets

Kate on all Platforms - 2024

Planet KDE - Sun, 2024-04-14 13:30
Unix like systems with X11 or Wayland

All Unix like systems with either X11 or Wayland are well supported since ever.

Linux with X11, and now Wayland, has long been the primary system on which Kate development happens.

Over the years it was, like most of the KDE applications, ported to various BSD variants.

Be it some mainstream Linux distribution like Fedora or a niche one like NixOS, Kate is available as binary package. You love BSD? From FreeBSD to OpenBSD, you can get a Kate package via your normal package system.

And in the normal case, you can just build it from source on your own, all needed patches should be in our repositories upstream. If that is not the case for your system, please help to upstream them.

Below is the current state of the master branch compiled on NixOS unstable with Wayland.

You can find out here how to compile Kate on your own on a Unix-like system and start helping to develop it.

Windows

For several years now there have been activities in the KDE community to provide our libraries and applications for Windows.

Even if that is a non-free platform, we can reach out to new users and developers that might later even be interested in switching to a fully open platform.

Progress is slow, but steady. We have Kate and some other applications in the official Windows Store and nightly builds for more of them. With reasonable effort you can develop Kate on Windows with Craft.

Below is the current state of the master branch running on Windows 11 inside VirtualBox.

If you would like to try that, use the nightly installer linked on the Kate website.

macOS

Besides Windows, the other major non-free platform Kate tries to support is macOS.

We have nightly builds available for that and you can, like on Windows, develop Kate with the help of Craft.

Below is the current state of the master branch running natively on my M2 ARM Mac Mini.

Same as for Windows, if you would like to try that, use the nightly installer for either ARM or Intel Macs linked on the Kate website.

Other Platforms

Naturally there are more operating systems around than the ones mentioned above.

Besides the mobile ones like Android and iOS, which are not that interesting for Kate, many other desktop operating systems exist.

Even if the Kate team itself doesn’t put active work into them, that doesn’t mean Kate can’t run there.

Without any active work on our side, for example, a Kate port for Haiku was done. Some one-liner patches for that even got upstreamed.

If you work on some port of our stuff and need to upstream patches, please contact us. Even if you work on a non-mainstream system, as long as the patches are not too intrusive, we are interested in having them.

Help us!

Naturally most of our developers are working on Linux or some BSD.

That means the other systems are always in need of more people to help out, both on the programming and the testing side.

For Kate, testing should be easy: grab a nightly build for Windows or macOS on the Kate website. Or even better, get Craft running; that will make it easier to contribute, too.

One recent topic that needs love is the removal of DBus for Windows/macOS/Android and other systems that don’t use it normally.

If you are up to helping with that, here is where the work is coordinated. The current state is already sufficient that the nightly builds of Kate no longer hang on e.g. macOS, but some frameworks like KIO will still need more work.

Just don’t get that wrong: DBus is great on the Linux or BSD systems that use it natively, but it is a pain on systems that have no notion of DBus, where it leads to hangs or the spawning of unwanted processes. Besides that, its usefulness is low there, as there are no services on the bus to communicate with anyway.

Categories: FLOSS Project Planets

Petter Reinholdtsen: Time to move orphaned Debian packages to git

Planet Debian - Sun, 2024-04-14 03:30

There are several packages in Debian without an associated git repository containing the packaging history. This is unfortunate and it would be nice if more of them had one. Quite a lot of these are without a maintainer, i.e. listed as maintained by the 'Debian QA Group' placeholder. In fact, 438 packages have this property according to UDD (SELECT source FROM sources WHERE release = 'sid' AND (vcs_url ilike '%anonscm.debian.org%' OR vcs_browser ilike '%anonscm.debian.org%' or vcs_url IS NULL OR vcs_browser IS NULL) AND maintainer ilike '%packages@qa.debian.org%';). Such packages can be updated without much coordination by any Debian developer, as they are considered orphaned.

To try to improve the situation and reduce the number of packages without an associated git repository, I started a few days ago to search out candidates and provide them with a git repository under the 'debian' collaborative Salsa project. I started with the packages pointing to obsolete Alioth git repositories, and am now working my way across the ones completely without git references. In addition to updating the Vcs-* debian/control fields, I try to update Standards-Version, the debhelper compat level, simplify d/rules, switch to Rules-Requires-Root: no and fix reported lintian issues. I only implement those changes that are trivial to fix, to avoid spending too much time on each orphaned package. So far my experience is that it takes approximately 20 minutes to convert a package without any git references, and a lot more for packages with existing git repositories incompatible with git-buildpackage.

So far I have converted 10 packages, and I will keep going until I run out of steam. As should be clear from the numbers, there are enough packages remaining for more people to do the same without stepping on each other's toes. I find it useful to start by searching for a git repo already on Salsa, as I find that sometimes a git repo has already been created, but no new version has been uploaded to Debian yet. In those cases I start with the existing git repository. I convert to the git-buildpackage+pristine-tar workflow, and ensure a debian/gbp.conf file with "pristine-tar=True" is added early, to avoid uploading an orig.tar.gz with the wrong checksum by mistake. I did that three times in the beginning before I remembered my mistake.

So, if you are a Debian Developer and have some spare time, perhaps consider migrating some orphaned packages to git?

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

Categories: FLOSS Project Planets

Mario Hernandez: Building a modern Drupal theme with Storybook

Planet Drupal - Sat, 2024-04-13 20:00

Building a custom Drupal theme nowadays is a more complex process than it used to be. Most themes require some kind of build tool such as Gulp, Grunt, Webpack or others to automate many of the repetitive tasks we perform when working on the front-end. Tasks like compiling and minifying code, compressing images, linting code, and many more. As Atomic Web Design became a thing, things got more complicated because now, if you are building components, you need a styleguide or design system to showcase and maintain those components. One of those design systems for me has been Patternlab. I started using Patternlab in all my Drupal projects almost ten years ago with great success. In addition, Patternlab has been the design system of choice at my place of work, but one of my immediate tasks was to work on migrating to a different design system. We have a small team but were very excited about the challenge of finding and using a more modern and robust design system for our large multi-site Drupal environment.

Disclaimer: Due to technical restrictions and project requirements, we were not able to use Single Directory Components or SDC, nor the Storybook module. Our front-end environment was custom built from the ground up.

Enter Storybook

After looking at various options for a design system, Storybook seemed to be the right choice for us for a couple of reasons: one, it has been around for about 10 years and during this time it has matured significantly, and two, it has become a very popular option in the Drupal ecosystem. In some ways, Storybook follows the same model as Drupal: it has a pretty active community and a very healthy ecosystem of plugins to extend its core functionality.

Storybook looks very promising as a design system for Drupal projects and with the recent release of Single Directory Components or SDC, and the new Storybook module, we think things can only get better for Drupal front-end development. Unfortunately for us, technical limitations in combination with our specific requirements, prevented us from using SDC or the Storybook module. Instead, we built our environment from scratch with a stand-alone integration of Storybook 8.

Our process and requirements

In choosing Storybook, we went through a rigorous research and testing process to ensure it would not only solve our immediate problems with our current environment, but also be around as a long-term solution. As part of this process, we also tested several available options like Emulsify and Gesso, which would be great options for anyone looking for a ready-to-go system out of the box. Some of our requirements included:

1. No components refactoring

The first and non-negotiable requirement was to be able to migrate components from Patternlab to a new design system with the least amount of refactoring possible. We have a decent amount of components which have been built within the last year, and the last thing we wanted was to have to rebuild them because we were switching design systems.

2. A new Front-end build workflow

I personally have been faithful to Gulp as a front-end build tool for as long as I can remember because it did everything I needed done in a very efficient manner. The Drupal project we maintain also used Gulp, but as part of this migration, we wanted to see what other options were out there that could improve our workflow. The obvious choice seemed to be Webpack, but as we looked closer into this we learned about ViteJS, "The Next Generation Frontend Tooling". Vite delivers on its promise of being "blazing fast", and its ecosystem is great and growing, so we went with it.

3. No more Sass in favor of PostCSS

CSS has drastically improved in recent years. It is now possible with plain CSS to do many of the things you used to only be able to do with Sass or a similar CSS preprocessor. Eliminating Sass from our workflow meant we would also be able to get rid of many other node dependencies related to Sass. The goal for this project was to use plain CSS in combination with PostCSS, and one bonus of using Vite is that Vite offers PostCSS processing out of the box without additional plugins or dependencies. Of course, if you want to do more advanced PostCSS processing you will probably need some external dependencies.

Building a new Drupal theme with Storybook

Let's go over the steps to building the base of your new Drupal theme with ViteJS and Storybook. This will be at a high level to call out only the most important and Drupal-related parts. This process will create a brand new theme. If you already have a theme you would like to use, make the appropriate changes to the instructions.

1. Setup Storybook with ViteJS

ViteJS
  • In your Drupal project, navigate to the theme's directory (i.e. /web/themes/custom/)
  • Run the following command:
npm create vite@latest my_theme # replace my_theme with a name of your choice.
  • When prompted, select the framework of your choice, for us the framework is React.
  • When prompted, select the variant for your project, for us this is JavaScript

After the setup finishes you will have a basic Vite project running.

Storybook
  • Be sure your system is running NodeJS version 18 or higher
  • Inside the newly created theme, run this command:
npx storybook@latest init --type react
  • After installation completes, you will have a new Storybook instance running
  • If Storybook didn't start on its own, start it by running:
npm run storybook

TwigJS

Twig templates are server-side templates which are normally rendered with TwigPHP to HTML by Drupal, but Storybook is a JS tool. TwigJS is the JS-equivalent of TwigPHP so that Storybook understands Twig. Let's install all dependencies needed for Storybook to work with Twig.

  • If Storybook is still running, press Ctrl + C to stop it
  • Then run the following command:
npm i -D vite-plugin-twig-drupal html-react-parser twig-drupal-filters @modyfi/vite-plugin-yaml
  • vite-plugin-twig-drupal: If you are using Vite like we are, this is a Vite plugin that handles transforming twig files into a Javascript function that can be used with Storybook. This plugin includes the following:
    • Twig or TwigJS: This is the JavaScript implementation of the Twig PHP templating language. This allows Storybook to understand Twig.
      Note: TwigJS may not always be in sync with the version of Twig PHP in Drupal and you may run into issues when using certain Twig functions or filters; however, we are adding other extensions that may help with the incompatibility issues.
    • drupal attribute: Adds the ability to work with Drupal attributes.
  • twig-drupal-filters: TwigJS implementation of Twig functions and filters.
  • html-react-parser: This extension is key for Storybook to parse HTML code into react elements.
  • @modyfi/vite-plugin-yaml: Transforms a YAML file into a JS object. This is useful for passing the component's data to React as args.
ViteJS configuration

Update your vite.config.js so it makes use of the new extensions we just installed, as well as configuring the namespaces for our components.

import { defineConfig } from "vite"
import yml from '@modyfi/vite-plugin-yaml';
import twig from 'vite-plugin-twig-drupal';
import { join } from "node:path"

export default defineConfig({
  plugins: [
    twig({
      namespaces: {
        components: join(__dirname, "./src/components"),
        // Other namespaces may be added.
      },
    }),
    // Allows Storybook to read data from YAML files.
    yml(),
  ],
})

Storybook configuration

Out of the box, Storybook comes with main.js and preview.js inside the .storybook directory. These two files are where a lot of Storybook's configuration is done. We are going to define the location of our components, the same location as we did in vite.config.js above (we'll create this directory shortly). We are also going to do a quick config inside preview.js for handling Drupal filters.

  • Inside .storybook/main.js file, update the stories array as follows:
stories: [
  "../src/components/**/*.mdx",
  "../src/components/**/*.stories.@(js|jsx|mjs|ts|tsx)",
],
  • Inside .storybook/preview.js, update it as follows:
/** @type { import('@storybook/react').Preview } */
import Twig from 'twig';
import drupalFilters from 'twig-drupal-filters';

function setupFilters(twig) {
  twig.cache();
  drupalFilters(twig);
  return twig;
}

setupFilters(Twig);

const preview = {
  parameters: {
    controls: {
      matchers: {
        color: /(background|color)$/i,
        date: /Date$/i,
      },
    },
  },
};

export default preview;

Creating the components directory
  • If Storybook is still running, press Ctrl + C to stop it
  • Inside the src directory, create the components directory. Alternatively, you could rename the existing stories directory to components.
Creating your first component

With the current system in place we can start building components. We'll start with a very simple component to try things out first.

  • Inside src/components, create a new directory called title
  • Inside the title directory, create the following files: title.yml and title.twig
Writing the code
  • Inside title.yml, add the following:
---
level: 2
modifier: 'title'
text: 'Welcome to your new Drupal theme with Storybook!'
url: 'https://mariohernandez.io'
  • Inside title.twig, add the following:
<h{{ level|default(2) }}{% if modifier %} class="{{ modifier }}"{% endif %}>
  {% if url %}
    <a href="{{ url }}">{{ text }}</a>
  {% else %}
    <span>{{ text }}</span>
  {% endif %}
</h{{ level|default(2) }}>

We have a simple title component that will print a title of anything you want. The level key allows us to change the heading level of the title (i.e. h1, h2, h3, etc.), and the modifier key allows us to pass a modifier class to the component, and the url will be helpful when our title needs to be a link to another page or component.

Currently the title component is not available in Storybook. Storybook uses a special file to display each component as a story; the file name is component-name.stories.jsx.

  • Inside title create a file called title.stories.jsx
  • Inside the stories file, add the following:
/**
 * First we import the `html-react-parser` extension to be able to
 * parse HTML into react.
 */
import parse from 'html-react-parser';

/**
 * Next we import the component's markup and logic (twig), data schema (yml),
 * as well as any styles or JS the component may use.
 */
import title from './title.twig';
import data from './title.yml';

/**
 * Next we define a default configuration for the component to use.
 * These settings will be inherited by all stories of the component,
 * shall the component have multiple variations.
 * `component` is an arbitrary name assigned to the default configuration.
 * `title` determines the location and name of the story in Storybook's sidebar.
 * `render` uses the parser extension to render the component's html to react.
 * `args` uses the variables defined in title.yml as react arguments.
 */
const component = {
  title: 'Components/Title',
  render: (args) => parse(title(args)),
  args: { ...data },
};

/**
 * Export the Title and render it in Storybook as a Story.
 * The `name` key allows you to assign a name to each story of the component.
 * For example: `Title`, `Title dark`, `Title light`, etc.
 */
export const TitleElement = {
  name: 'Title',
};

/**
 * Finally export the default object, `component`. Storybook/React requires this step.
 */
export default component;
  • If Storybook is running you should see the title story. See example below:
  • Otherwise start Storybook by running:
npm run storybook

With Storybook running, the title component should look like the image below:


The controls highlighted at the bottom of the title allow you to change the values of each of the fields for the title.

I wanted to start with the simplest of components, the title, to show how Storybook, with help from the extensions we installed, understands Twig. The good news is that the same approach we took with the title component works on even more complex components. Even the React code we wrote does not change much on large components.

In the next blog post, we will build more components that nest smaller components, and we will also add Drupal related parts and configuration to our theme so we can begin using the theme in a Drupal site. Finally, we will integrate the components we built in Storybook with Drupal so our content can be rendered using the component we're building. Stay tuned. For now, if you want to grab a copy of all the code in this post, you can do so below.

Download the code

Resources In closing

Getting to this point was a team effort and I'd like to thank Chaz Chumley, a Senior Software Engineer, who did a lot of the configuration discussed in this post. In addition, I am thankful to the Emulsify and Gesso teams for letting us pick their brains during our research. Their help was critical in this process.

I hope this was helpful and if there is anything I can help you with in your journey of a Storybook-friendly Drupal theme, feel free to reach out.

Categories: FLOSS Project Planets

Dropsolid Experience Agency: Ready for Drupal 10?

Planet Drupal - Sat, 2024-04-13 14:24
December 14 is the release date of Drupal 10. With the switch to the Drupal core release cycle, the impact of the upgrade is not huge. Be prepared anyhow!
Categories: FLOSS Project Planets

Dropsolid Experience Agency: Why Drupal is a Shark

Planet Drupal - Sat, 2024-04-13 14:24
Drupal has been able to adapt and stay the top choice for open enterprise CMS, especially as the key component in the modern digital experience platform (DXP).
Categories: FLOSS Project Planets
