Feeds

Luca Saiu: Languages and complexity, Part I: why I love Anki

GNU Planet! - Tue, 2024-01-02 13:47
Lately I have not been as active in GNU (https://www.gnu.org) as I would have liked—which I plan to change. Apart from work I was busy with happy family life next to E.; and, I guess, with contemplating the dismal state of the West as it descends further and further into tyranny amid the general indifference. Maybe in part seeking solace from the news I focused with renewed intensity on my hobby, studying the Russian language for no reason much more practical than my love for nineteenth-century novels. I have heard more than one Russian teacher vocally disapproving of literature as ... [Read more]
Categories: FLOSS Project Planets

Hynek Schlawack: How to Ditch Codecov for Python Projects

Planet Python - Tue, 2024-01-02 11:39

Codecov’s unreliability breaking CI on my open source projects has been a constant source of frustration for me for years. I have found a way to enforce coverage over a whole GitHub Actions build matrix that doesn’t rely on third-party services.

Categories: FLOSS Project Planets

Chapter Three: 15 Tips for Writing Better Web Copy

Planet Drupal - Tue, 2024-01-02 11:12
If you’re reading this blog post, something had to bring you here as opposed to the other roughly 600 million blogs out there. Perhaps you’re a regular visitor to this site. Maybe you Googled “writing tips for better web copy” and this showed up. Or you might be one of our social media followers and liked the look of this title. Search engines are a lot smarter than they were ten years ago. Gone are the days when you could “cheat” the system by front-loading your text with a bunch of keywords. Today’s SEO reads for quality of content as well as keywords. But once you’ve got a web user’s attention, you want to keep it — and hopefully encourage them to visit other pages of your site and take whatever actions you want them to take. Again, quality web content is key.
Categories: FLOSS Project Planets

Thomas Koch: Good things come ... state folder

Planet Debian - Tue, 2024-01-02 11:05
Posted on January 2, 2024 Tags: debian, free software, life

Just a little while ago (10 years) I proposed the addition of a state folder to the XDG basedir specification and expanded the article XDGBaseDirectorySpecification in the Debian wiki. Recently I learned that version 0.8 (from May 2021) of the spec finally includes a state folder.

Granted, I wasn’t the first to have this idea (2009), nor the one who actually made it happen.

Now, please go ahead and use it! Thank you.
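If you want to follow that advice, here is a minimal sketch of resolving the new directory in Python (the application name is made up; the ~/.local/state fallback is the default given in version 0.8 of the spec):

import os
from pathlib import Path

def xdg_state_home() -> Path:
    # Use $XDG_STATE_HOME if set, otherwise the spec's default of ~/.local/state
    return Path(os.environ.get("XDG_STATE_HOME") or Path.home() / ".local" / "state")

# e.g. keep a history file out of the dotfile clutter
state_dir = xdg_state_home() / "myapp"
state_dir.mkdir(parents=True, exist_ok=True)
(state_dir / "history").touch()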

Categories: FLOSS Project Planets

Thomas Koch: Know your tools - simple backup with rsync

Planet Debian - Tue, 2024-01-02 11:05
Posted on June 9, 2022 Tags: debian, free software

I’ve been using rsync for years and still did not know its full powers. I just wanted a quick and dirty simple backup but realised that rsnapshot is not in Debian anymore.

However, you can do much of what rsnapshot does with rsync alone nowadays.

The --link-dest option (manpage) handles creating hardlinks to a previous backup (found here). So my backup program becomes this shell script in ~/backups/backup.sh:

#!/bin/sh
SERVER="${1}"
BACKUP="${HOME}/backups/${SERVER}"
SNAPSHOTS="${BACKUP}/snapshots"
FOLDER=$(date --utc +%F_%H-%M-%S)
DEST="${SNAPSHOTS}/${FOLDER}"
LAST=$(ls -d1 ${SNAPSHOTS}/????-??-??_??-??-?? | tail -n 1)

rsync \
    --rsh="ssh -i ${BACKUP}/sshkey -o ControlPath=none -o ForwardAgent=no" \
    -rlpt \
    --delete --link-dest="${LAST}" \
    ${SERVER}::backup "${DEST}"
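Invoked with the server name as its only argument (the hostname below is just a placeholder), it creates a new timestamped snapshot under ~/backups/<server>/snapshots:

~/backups/backup.sh backupserver.example.org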

The script connects to rsync in daemon mode as outlined in section “USING RSYNC-DAEMON FEATURES VIA A REMOTE-SHELL CONNECTION” in the rsync manpage. This makes it possible to reference a “module” as the source, which is defined on the server side as follows:

[backup]
path = /
read only = true
exclude from = /srv/rsyncbackup/excludelist
uid = root
gid = root

The important bit is the read only setting, which protects the server against somebody with access to the ssh key overwriting files on the server via rsync and thus gaining full root access.

Finally, the command prefix in ~/.ssh/authorized_keys runs rsync as a daemon with sudo and the specified config file:

command="sudo rsync --config=/srv/rsyncbackup/config --server --daemon ."

The sudo setup is left as an exercise for the reader as mine is rather opinionated.

Unfortunately I have not managed to configure systemd timers in the way I wanted and therefore opened an issue: “Allow retry of timer triggered oneshot services with failed conditions or asserts”. Any help there is welcome!

Categories: FLOSS Project Planets

Thomas Koch: Missing memegen

Planet Debian - Tue, 2024-01-02 11:05
Posted on May 1, 2022 Tags: debian, free software, life

Back at $COMPANY we had an internal meme site. I had some reputation in my team for creating good memes. When I watched Episode 3 of Season 2 of Yes, Prime Minister yesterday, I really missed a place to post memes.

This is the full scene. Please watch it or even the full episode before scrolling down to the GIFs. I had a good laugh for some time.

With Debian, I could just download the episode from somewhere on the net with youtube-dl and easily create two GIFs using ffmpeg, one with and one without the subtitle:

ffmpeg -ss 0:5:59.600 -to 0:6:11.150 -i Downloads/Yes.Prime.Minister.S02E03-1254485068289.mp4 tmp/tragic.gif

ffmpeg -ss 0:5:59.600 -to 0:6:11.150 -i Downloads/Yes.Prime.Minister.S02E03-1254485068289.mp4 \
    -vf "subtitles=tmp/sub.srt:force_style='Fontsize=60'" tmp/tragic_with_subtitle.gif

And this sub.srt file:

1
00:00:10,000 --> 00:00:12,000
Tragic.

I believe one needs to install the libavfilter-extra variant to burn the subtitle into the GIF.

Some

space

to

hide

the

GIFs.

The Prime Minister just learned that his predecessor, who was about to publish embarrassing memoirs, died of a sudden heart attack:

I can’t actually think of a meme with this GIF that the internal thought police community moderation would not immediately take down.

For a moment I thought that it would be fun to have a Meme-Site for Debian members. But it is probably not the right time for this.

Maybe somebody likes the above GIFs though and wants to use them somewhere.

Categories: FLOSS Project Planets

Thomas Koch: lsp-java coming to debian

Planet Debian - Tue, 2024-01-02 11:05
Posted on March 12, 2022 Tags: debian

The Language Server Protocol (LSP) standardizes communication between editors and so-called language servers for different programming languages. This reduces the old problem that every editor had to implement many different plugins for all the different programming languages. With LSP, an editor just needs to talk LSP and can immediately provide typical IDE features.

I already packaged the Emacs packages lsp-mode and lsp-haskell for Debian bullseye. Now lsp-java is waiting in the NEW queue.

I’m always worried about downloading and executing binaries from random places on the internet. It should be a matter of hygiene to only run binaries from official Debian repositories. Unfortunately this is not feasible when programming, and many people don’t see a problem with running multiple curl | sh pipes to set up their programming environment.

I prefer to do such stuff only in virtual machines. With Emacs and LSP I can finally have a lightweight textmode programming environment even for Java.

Unfortunately the lsp-java mode does not yet work over tramp. Once this is solved, I could run emacs on my host and only isolate the code and language server inside the VM.

The next step would be to also keep the code on the host and mount it with Virtio FS in the VM. But so far the necessary daemon is not yet in Debian (RFP: #1007152).

In detail, I uploaded these packages:

Categories: FLOSS Project Planets

Thomas Koch: Waiting for a STATE folder in the XDG basedir spec

Planet Debian - Tue, 2024-01-02 11:05
Posted on February 18, 2014 Tags: debian, free software

The XDG Base Directory specification proposes default homedir folders for the categories DATA (~/.local/share), CONFIG (~/.config) and CACHE (~/.cache). One category, however, is missing: STATE. This category has been requested several times, but nothing has happened.

Examples for state data are:

  • history files of shells, repls, anything that uses libreadline
  • logfiles
  • state of application windows on exit
  • recently opened files
  • last time application was run
  • emacs: bookmarks, ido last directories, backups, auto-save files, auto-save-list

The missing STATE category is especially annoying if you’re managing your dotfiles with a VCS (e.g. via VCSH) and you care to keep your homedir tidy.

If you’re as annoyed as me about the missing STATE category, please voice your opinion on the XDG mailing list.

Of course it’s a very long way until applications really use such a STATE directory. But without a common standard it will never happen.

Categories: FLOSS Project Planets

Cleaning up KDE's metadata - the little things matter too

Planet KDE - Tue, 2024-01-02 09:29
Cleaning up KDE's metadata – the little things matter too

Lots of my KDE contributions revolve around plugin code and its metadata, meaning I have a good overview of where and how metadata is used. In this post, I will highlight some recent changes and show you how to utilize them in your Plasma Applets and KRunner plugins!

Applet and Containment metadata

Applets (or Widgets) are one of Plasma's main selling points regarding customizability. In addition to user-visible information like the name, description and categories, some technical metadata properties are needed. These include X-Plasma-API-Minimum-Version for the compatible versions, the ID, and the package structure, which should always be “Plasma/Applet”.

For integrating with the system tray, applets had to specify the X-Plasma-NotificationArea and X-Plasma-NotificationAreaCategory properties. The first one says that the applet may be shown in the system tray and the second one says which category it belongs in. But since we don't want any applets without categories in there, the first value is redundant and may be omitted! Also, it was treated like a boolean value, although only the strings "true" or "false" were expected. I stumbled upon this when correcting the types in metadata files.
This was noticed due to preparations for improving the JSON linting we have in KDE. I will resume working on it and might also blog about it :).

What most applets in KDE specify is X-KDE-MainScript, which determines the entrypoint of the applet. This is usually “ui/main.qml”, but in some cases the value differs. When working with applets, it is confusing to first have to look at the metadata in order to find the correct file. This key was removed in Plasma6 and the file is always ui/main.qml. Since this was the default value for the property, you may even omit it for Plasma5.
The same filename convention is also enforced for QML config modules (KCMs).

What all applets needed to specify was X-Plasma-API. This is typically set to "declarativeappletscript", but we de facto never used its value; we only enforced that it was present. This was historically needed because in Plasma4, applets could be written in other scripting languages. From Plasma6 onward, you may omit this key.

In the Plasma repositories, this allowed me to clean up over 200 lines of JSON data.
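To make this concrete, here is a minimal sketch of what a Plasma 6 applet's metadata.json could look like after this cleanup (the Id, Name and category values are made up for illustration; check the KDE developer documentation for the authoritative key list):

{
    "KPlugin": {
        "Id": "org.example.testapplet",
        "Name": "Test Applet",
        "Description": "A hypothetical example applet"
    },
    "KPackageStructure": "Plasma/Applet",
    "X-Plasma-API-Minimum-Version": "6.0"
}

Note that neither X-KDE-MainScript nor X-Plasma-API appears anymore.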

metadata.desktop files of Plasma5 addons

Just in case you were wondering: We have migrated from metadata.desktop files to only metadata.json files in Plasma6. This makes providing metadata more consistent and efficient. In case your projects still use the old format, you can run desktoptojson -i pathto/metadata.desktop and remove the file afterward.
See https://develop.kde.org/docs/plasma/widget/properties/#kpackagestructure for more detailed information. You can even do the conversion when targeting Plasma5 users!

Default object path for KRunner DBus plugins

Another nice addition is that “/runner” will now be the default object path. This means you can omit this one key. Check out the template to get started: https://invent.kde.org/frameworks/krunner/-/tree/master/templates/runner6python. DBus-Runners that worked without deprecations in Plasma5 will continue to do so in Plasma6! For C++ plugins, I will make a porting-guide-like blog post soonish, because I have started porting my own plugins to work with Plasma6.

Finally, I want to wish you all a happy and productive new year!

Categories: FLOSS Project Planets

Real Python: HTTP Requests With Python's urllib.request

Planet Python - Tue, 2024-01-02 09:00

If you need to perform HTTP requests using Python, then the widely used Requests library is often the way to go. However, if you prefer to use only standard-library Python and minimize dependencies, then you can turn to urllib.request instead.

In this video course, you’ll:

  • Learn the essentials of making basic HTTP requests with urllib.request
  • Explore the inner workings of an HTTP message and how urllib.request represents it
  • Grasp the concept of handling character encodings in HTTP messages
  • Understand common hiccups when using urllib.request and learn how to resolve them

If you’re already familiar with HTTP requests such as GET and POST, then you’re well prepared for this video course. Additionally, you should have prior experience using Python to read and write files, ideally using a context manager.

In the end, you’ll discover that making HTTP requests doesn’t have to be a frustrating experience, despite its reputation. Many of the challenges people face in this process stem from the inherent complexity of the Internet. The good news is that the urllib.request module can help demystify much of this complexity.
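As a taste of what the course builds on, a minimal GET request with the standard library looks roughly like this (example.com stands in for a real endpoint):

from urllib.request import urlopen

with urlopen("https://www.example.com") as response:
    body = response.read()        # raw bytes of the HTTP message body
    text = body.decode("utf-8")   # decode with the appropriate character encoding
    print(response.status, len(text))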

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Palantir: Planning Your Drupal 7 Migration: Organizational Groundwork

Planet Drupal - Tue, 2024-01-02 07:00

How to ensure a smooth Drupal 7 migration with content, design, and technology audits

In this article, we focus on tackling the organizational challenges that come with the technical realities of the migration away from Drupal 7 to newer, up-to-date versions of the content management system​​. We outline a strategic blueprint for a successful Drupal 7 migration, centered around three critical audits:

  • Content audit: Evaluating which content should be carried over to the new platform.
  • Design audit: Seizing opportunities to enhance the site’s design during the rebuild.
  • Technical audit: Refining the site architecture and meticulously planning the technical aspects of the migration.

This is the perfect catalyst and opportunity for your organization to assess and not just transition, but to reconstruct your digital presence to meet your future goals and challenges.

As the clock ticks towards January 2025, the End of Life (EOL) for Drupal 7 looms on the horizon. Moving away from Drupal 7 marks a critical juncture. Since Drupal 8 and above are significantly different in architecture from Drupal 7, the move requires a comprehensive migration. The good news is that the upgrade paths from Drupal 8 and above are smoother, so the step up from Drupal 7 represents a unique, one-time effort, which will pay itself off in longer site life cycles and higher ROI on that investment.

The core principle guiding this migration strategy is the alignment of technical implementation and design with the initial content audit. This approach ensures that the rebuild is driven by content needs, rather than forcing content to conform to a predetermined site structure.

Some key organizational challenges when navigating this technical shift away from Drupal 7 revolve around governance: understanding the existing content on your website, assigning responsibility for it, and ensuring its relevance and quality throughout the migration process.

This technical inflection point can and should also spark a broader debate about the extent of redesign and transformation needed during the migration. IT and marketing teams should be discussing, in a nutshell, “What do we have? What can be better? What do we need to make that happen?”

Palantir.net is a Drupal Association–Certified Migration Partner. We have the technical expertise, experience, and strategic insight required to help steer you through this vital transition. Whether through our Continuous Delivery Portfolio (CDP) or web modernization services, Palantir is ready to assist you in navigating the complexities of the D7 EOL upgrade.

Content audit: Decluttering the house

At the heart of any successful migration lies a well-executed content audit, a responsibility primarily shouldered by the Marketing team. This vital process streamlines the migration by identifying what content truly needs to be transferred to the new platform and what can be retired.

The essence of a content audit

The key questions to address during the audit are: What content do we need? What can we do without? These decisions should be data-driven, relying on analytics to assess the relevance and usage of the content. Key metrics include page views, content validity, and user engagement.

If you don’t like having a cluttered house, you don’t just decide to build a slightly bigger house, then move all your clutter into it. It would be a better idea to take a look at what’s actually cluttering your house first, then decide what you need to build. In the same way, letting content drive your technical decisions is the better approach.

The complexity of a given migration is often more dependent on the variety of different content types than the volume of content. Once a system for migrating a specific type of content, like blog posts, is developed, it can run autonomously while the technical team focuses on other content types. For this reason, a site with numerous content types requires a more intricate migration plan compared to one with fewer types. A content audit can help reduce the number of content types and with it the resulting effort needed.

Tips for conducting a successful content audit

 The following considerations can help make your content audit a smooth and effective process:

  • Develop a comprehensive content inventory: Start by cataloging every piece of content on your website. This step is crucial as it allows you to see the full scope of your existing content and understand what requires improvement, discarding, or migration. Document key details of each content piece, such as URLs, titles, purpose, format, number of internal links, and other relevant information.
  • Make data-driven decisions: Use tools like Google Analytics and Google Search Console to review content performance, examining metrics like traffic, engagement, bounce rates, time on page, and conversion rates. This quantitative analysis helps inform your content strategy and guides decisions on what content needs updating or removal.
  • Complement data with qualitative decisions: Compare your content against benchmarks that align with your goals and audience needs. Assess the content for user experience, compliance with your style guide, and potential improvements. Decide on actions for each content piece, such as keeping, updating, consolidating, or removing, based on their relevance, quality, and performance.
  • Involve a content strategist: An expert content strategist can help with all the above tasks and aid you in preparing a migration framework. They will help align your content with your marketing and branding goals, as well as UX design and information architecture. If you don’t have an internal content strategist, Palantir can provide one if we help you with your migration. 
Content as opportunity for a redesign

Conducting a content audit does more than just streamline the migration process. It can also unveil opportunities for a redesign of your site’s information and content architecture, aligned with a new content strategy. This process is not just about elimination, but also about discovery — uncovering what content is most valued by users. Not only are you finding out what you don’t need, but you’re hopefully finding out what's really important as well.

Given that moving from Drupal 7 to Drupal 10 essentially entails a complete site rebuild, there lies a golden opportunity to design the site around the content. This approach ensures that the site architecture complements and enhances the content, rather than forcing content to fit into a pre-existing structure.

The insights won here feed into the second crucial stage of a Drupal 7 migration: the design audit.

Design audit: An opportunity for enhancing UX

A design audit is where you and your Marketing team evaluate the current design’s effectiveness and explore the potential for a redesign. It goes hand-in-hand with a content audit.

Design audit objectives
  • Evaluate current design effectiveness: Before deciding on a redesign, critically assess how well your current design serves your content and users. Does it facilitate easy navigation? Is it aesthetically pleasing and functionally efficient?
  • Consider compatibility with Drupal 10: Drupal 10 brings new features and capabilities. The design audit of a Drupal 7 website usually reveals a rigid, template-based layout system, limiting content display versatility. By migrating to Drupal 10 and utilizing its advanced Layout Builder, the redesign can offer dynamic, user-friendly layouts, enhancing user engagement and providing flexibility for content strategy adaptations.

    Note that, while migrating away from Drupal 7, you essentially rebuild your site. Even if you choose to retain the existing design, adapting the look and feel to the newer Drupal version will require some level of reworking as well. If your existing design seems incompatible or would require extensive modifications, it might be more efficient to opt for a new design.
  • Align design with content strategy: The design should complement and enhance the content, not overshadow it. A design audit should involve close coordination with content strategists to ensure that the design facilitates the content’s purpose and enhances user engagement.
  • Explore modern design trends: Technology and design trends evolve rapidly. Use this migration as an opportunity to refresh your website’s look and feel to stay relevant and appealing to your audience.
  • Accessibility enhancement: Focus on improving the overall user experience for everyone. This includes optimizing the site for various devices and improving accessibility, for instance, compliance with A11Y guidelines.

Palantir not only offers technical expertise in migration processes but also provides skilled designers who can seamlessly collaborate with your team. Our designers are adept at working alongside content strategists. They ensure that you end up with a cohesive system that effectively supports and enhances your content strategy, ensuring that every aspect of your site’s design is driven by and aligned with your overall content goals.

Technical audit: Engineering a future-ready framework

Next up, your internal IT team should perform a comprehensive technical audit. If necessary, this stage can overlap with the content audits. However, we recommend that your migration should be primarily driven by the insights gained from your content audit.

The ultimate goal of the technical audit is preparing for a new Drupal environment. This means understanding how the identified technical elements will function in the new system and planning for any necessary adaptations or developments.

Data architecture audit

The technical audit begins with a detailed analysis of how data is structured in the current Drupal 7 site. This involves examining the entity relationships and the architecture of data storage. Understanding how different pieces of content are interlinked and how they reference each other is essential. This step not only overlaps with the content audit but also sets the stage for a smooth technical transition to Drupal 10.

Custom functionality and integration evaluation

A critical aspect of the technical audit is assessing any custom functionalities or third-party integrations present in the current system. This includes custom plug-in modules, single sign-on mechanisms, and other unique features. Each custom element that you migrate over is something you have to maintain, potentially throughout the lifetime of the site. The decision to migrate these elements should be based on their current value and necessity. During the audit, aim to determine which functionalities are essential and how they can be adapted or redeveloped for Drupal 10.

Driving collaborative decision-making

Collaboration between the IT/technical, marketing, content strategy, and design teams is vital in deciding what to keep (and migrate) and what to discard — regarding site content, architecture, code, and functionality. The technical audit, outlining the functionalities and integrations of the current site, guides the planning and decision-making process following the insights you gain from the content and design audits.

Conclusion: Charting the course of a Drupal migration

As we’ve seen, the journey away from Drupal 7 involves three main audits:

  • the content audit, which acts as a decluttering exercise;
  • the design audit, seizing opportunities to enhance user experience;
  • and the technical audit, engineering a future-ready framework. 

The content audit is the central pillar, and content strategy should drive the technical implementation and design decisions. This approach ensures a migration process where content seamlessly integrates into an efficient, updated site structure, rather than being confined by it.

Palantir is here to help and guide you through a successful migration from Drupal 7 to the future of your online digital presence. We are a Drupal Association–certified migration partner, with years of experience with intricate processes. Our expertise in content strategy, design innovation, and technical proficiency makes us an ideal full-service partner for navigating the complexities of a D7 end-of-life upgrade.

If you’re considering this critical step in your digital strategy, we invite you to explore how Palantir’s Continuous Delivery Portfolio (CDP) and web modernization services can transform your digital presence.

Categories: FLOSS Project Planets

LN Webworks: Why Media Business Should Choose Drupal As Their First Priority CMS: 5 Big Reasons

Planet Drupal - Tue, 2024-01-02 04:36

Media has changed a lot with new technology. Organizations need to connect with their audience using modern methods. Big media companies use digital tech to reach people on different platforms and make more money.

Top media networks have many websites and social media pages to reach more people. They create websites for different groups. In the age of Web 3.0, having a strong online presence is crucial. Drupal helps with that, improving the customer experience and increasing conversion by 25% for one media client we worked with.

Why is Drupal Preferred Over Other CMS? 

Media relies on content, and that content needs to bring in revenue in a scalable way. To achieve this while keeping things easy to manage, you need the right Content Management System (CMS). Considering the constant flow of new content, choosing the right CMS is crucial.

Categories: FLOSS Project Planets

Django Weblog: Django bugfix releases issued: 4.2.9 and 5.0.1

Planet Python - Tue, 2024-01-02 04:03

Today we've issued 5.0.1 and 4.2.9 bugfix releases.

The release package and checksums are available from our downloads page, as well as from the Python Package Index. The PGP key ID used for this release is Mariusz Felisiak: 2EF56372BA48CD1B.
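Upgrading an existing installation is then typically a matter of installing the new point release, for example with pip (pick the version matching your release series):

python -m pip install --upgrade Django==4.2.9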

Categories: FLOSS Project Planets

Talk Python to Me: #444: The Young Coder's Blueprint to Success

Planet Python - Tue, 2024-01-02 03:00
Are you early in your software dev or data science career? Maybe it hasn't even really started yet and you're still in school. On this episode we have Sydney Runkle, who has had a ton of success in the Python space and she hasn't even graduated yet. We sit down to talk about what she's done and might do differently again to achieve that success. It's "The Young Coder's Blueprint to Success" on episode 444 of Talk Python To Me.

Links from the show:

  • Sydney Runkle: https://www.linkedin.com/in/sydney-runkle-105a35190/
  • Pydantic: https://pydantic.dev
  • Code Combat: https://codecombat.com/play
  • Humanitarian Toolbox: http://www.htbox.org
  • PyCon 2024: https://us.pycon.org/2024/
  • Good first issue example: https://github.com/pydantic/pydantic/labels/good%20first%20issue
  • Watch this episode on YouTube: https://www.youtube.com/watch?v=LtEYowIazVQ
  • Episode transcripts: https://talkpython.fm/episodes/transcript/444/the-young-coders-blueprint-to-success

Stay in touch with us:

  • Subscribe to us on YouTube: https://talkpython.fm/youtube
  • Follow Talk Python on Mastodon: https://fosstodon.org/web/@talkpython
  • Follow Michael on Mastodon: https://fosstodon.org/web/@mkennedy

Episode sponsors:

  • Talk Python Training: https://talkpython.fm/training
Categories: FLOSS Project Planets

Specbee: An Introduction to PHP Standard Recommendation (PSR)

Planet Drupal - Tue, 2024-01-02 02:31
Once upon a time, at a conference, the lead developers from a selection of frameworks sat down in the same room and agreed on some standards for all their projects to use. The aim was to make PHP frameworks and libraries easier to combine for users. That is when PHP-FIG, the PHP Framework Interop Group, was born. This group of awesome individuals oversees the PHP Standards Recommendations (PSRs). A PHP Standard Recommendation (PSR) is a PHP specification published by the PHP Framework Interoperability Group (PHP-FIG). It serves the standardization of programming concepts in PHP, with the aim of enabling interoperability of components. The PHP-FIG is formed by the founders of several PHP frameworks. Dive into this article to learn about the different PSRs and how you can adhere to them.

PSR-0 & PSR-4

These describe a specification for autoloading classes from file paths. PSR-0 and PSR-4 are both standards concerning namespaces, class names and file paths, and they also describe where to place the files that will be autoloaded according to the specification.

Autoloading

Autoloading is functionality that helps developers include PHP classes automatically, without writing cluttered include/require statements everywhere. In plain PHP, a class's definition is loaded with require or include statements in the files where it is used, prior to using it, as shown below:

include 'Utility/TestExample.php';
$exampleObj = new TestExample();

The above approach raises issues: if we have tens of external classes to be used in a file, we end up writing lines of require/include statements right at the beginning of the source file. To overcome this, PHP 5 introduced the magic function __autoload(), which is automatically called when your code references a class or interface that hasn't been loaded yet:

void __autoload(string $classname);

Here's an example of a basic __autoload() implementation:

<?php
function __autoload($className)
{
    $filename = 'Utility/' . $className . '.php';
    if (is_readable($filename)) {
        require $filename;
    }
}

$exampleObj = new TestExample();

The major drawback to the __autoload() function is that you can only provide one autoloader with it. PHP 5.1.2 introduced another autoloading function, spl_autoload_register(), to cope with __autoload()'s limitation. The introduction of spl_autoload_register() gave programmers the ability to create an autoload chain, a series of functions that can be called to try and load a class or interface. For example:

<?php
function utilityAutoloader($className)
{
    $filename = 'Utility/' . $className . '.php';
    if (is_readable($filename)) {
        require $filename;
    }
}

function functionAutoloader($className)
{
    $filename = 'Functions/' . $className . '.php';
    if (is_readable($filename)) {
        require $filename;
    }
}

spl_autoload_register('utilityAutoloader');
spl_autoload_register('functionAutoloader');

Autoloading was such a great idea that every project started to use it. Inevitably, everyone created their own version of an autoloader, as uniform standards were lacking. Clearly, PHP desperately needed a standard for autoloading, which is how PSR-0 was born. The latest accepted autoloader standard is PSR-4.

PSR-0 (Autoloading Standard)

Overview of PSR-0:

  • A fully-qualified namespace and class must have the structure \<Vendor Name>\(<Namespace>\)*<Class Name>.
  • Each namespace must have a top-level namespace (“Vendor Name”).
  • Each namespace can have as many sub-namespaces as it wishes.
  • Each namespace separator is converted to a DIRECTORY_SEPARATOR when loading from the file system.
  • Each _ character in the CLASS NAME is converted to a DIRECTORY_SEPARATOR. The _ character has no special meaning in the namespace.
  • The fully-qualified namespace and class are suffixed with .php when loading from the file system.
  • Alphabetic characters in vendor names, namespaces, and class names may be of any combination of lowercase and uppercase.

Examples:

\Doctrine\Common\IsolatedClassLoader => /path/to/project/lib/vendor/Doctrine/Common/IsolatedClassLoader.php
\Symfony\Core\Request => /path/to/project/lib/vendor/Symfony/Core/Request.php

PSR-4 (Autoloading Standard)

Overview of PSR-4:

  • The term “class” refers to classes, interfaces, traits, and other similar structures.
  • A fully qualified class name has the form \<NamespaceName>(\<SubNamespaceNames>)*\<ClassName>.
  • The fully qualified class name MUST have a top-level namespace name, also known as a “vendor namespace”.
  • The fully qualified class name MAY have one or more sub-namespace names.
  • The fully qualified class name MUST have a terminating class name.
  • Underscores have no special meaning in any portion of the fully qualified class name.
  • Alphabetic characters in the fully qualified class name MAY be any combination of lowercase and uppercase.
  • All class names MUST be referenced in a case-sensitive fashion.

Example of PSR-4 based autoloading using Composer:

Consider the following directory structure to achieve PSR-4 based autoloading using Composer. Create a composer.json file using composer init, or create one manually in your project’s root:

specbee@specbee-HP-ProBook-640-G1:/var/www/html/psr$ touch composer.json

Set up PSR-4 autoloading by editing the composer.json file as shown below:

{
    "autoload": {
        "psr-4": {
            "CodeCourse\\": "src/"
        }
    }
}

Here, CodeCourse is the vendor name of your application; you can use this name while namespacing files inside of your src directory, such as:

namespace CodeCourse\Filters;

or

namespace CodeCourse\Repositories;

And src is your application’s directory that you want to autoload.

Next, open up your terminal and type in the following command to install the autoloading files in your project. This will generate the vendor directory and the autoload.php file inside of it:

specbee@specbee-HP-ProBook-640-G1:/var/www/html/psr$ composer dump-autoload -o

Let’s first create a couple of classes inside of the CodeCourse directory, for example AuthFilters.php inside CodeCourse/Filters.

PSR-1 (Basic Coding Standard)

The above example causes a side effect, i.e., loading a file named "file.php". Files must be in UTF-8 without BOM (Byte Order Mark). Namespaces and class names must follow the standards in PSR-0 and PSR-4.

Here is an example that illustrates the basic naming conventions for properties, classes, and methods:

<?php
class ClassName
{
    public $firstProperty; // Don't declare multiple properties in a single line
    public static $StaticProperty;

    public function firstMethod()
    {
        // definition...
    }
}
?>

PSR-2 (Coding Style Guide)

Overview of PSR-2:

  • You must follow the PSR-1 coding standards.
  • 4 spaces must be used for indents. Using tabs is not allowed.
  • There is no limit to line length, but it should be under 120 characters, and ideally under 80.
  • There must be one blank line after the namespace declaration, and one blank line after the block of use declarations.
  • Opening curly braces for classes and methods must go on the next line, and closing curly braces must go on the line after the body.
  • Methods and properties must be defined with abstract/final first, followed by public/protected, and finally the static keyword.
  • You must not put a newline before curly braces in conditional statements.
  • You must not put any spaces after ( or before ) in conditional statements.

An example for defining classes: you must open the curly braces on a new line, and the extends and implements keywords must be used on a single line.

<?php
class ClassName extends ParentClass implements InterfaceName
{
    // class definition...
}
?>

If there are multiple interfaces to implement, then you can write the interface names on new lines as shown below:

<?php
class ClassName extends ParentClass implements
    InterfaceName1,
    InterfaceName2,
    InterfaceName3
{
    // class definition...
}
?>

An example to show how methods are defined: the arguments should be written on the same line, you must not put any whitespace before the commas in the argument list, and you must put one whitespace after them.

<?php
class ClassName
{
    public function method($arg1, $arg2)
    {
        // definition...
    }
}
?>

If there is a large number of arguments, they can be written on new lines, one after the other:

<?php
class ClassName
{
    public function method(
        $arg1,
        $arg2,
        $arg3
    ) {
        // definition...
    }
}
?>

When defining methods, you must declare one of public/protected/private. The visibility mode comes after the abstract/final keyword, if used, and static is the last modifier.

<?php
abstract class ClassName
{
    abstract public function abstractMethod();

    final public static function staticMethod()
    {
        // definition...
    }
}
?>

Conditional statements:

  • You must put one whitespace before (
  • You must not put any whitespace after (
  • You must not put any whitespace before )
  • You must put one whitespace after )
  • Use elseif rather than else if.

An example to show the difference between elseif and else if. Interpretation of elseif:

<?php
if ($condition1) {
    // ...
} elseif ($condition2) {
    // ...
} else {
    // ...
}
?>

Interpretation of else if:

<?php
if ($condition1) {
    // ...
} else {
    if ($condition2) {
        // ...
    }
}
?>

For switch statements:

  • The curly braces must be opened on the same line where the switch statement is written.
  • The case body must be indented once from the case, and the case must be indented once from the switch.
  • Write a "no break" comment when break is intentionally omitted.
  • You can also use return instead of break.

Example:

<?php
switch ($condition) {
    case 0:
        echo 'Use a break';
        break;
    case 1:
        echo 'If you are not using break, then write no break as a comment';
        // no break
    case 2:
        echo 'Use return instead of break';
        return;
    default:
        echo 'Default';
        break;
}
?>

Huge shoutout to Samvada Jain for her contributions to this article.

Final Thoughts

In a project that incorporates various packages, it can be a mess if each individual uses different coding standards. This is the reason why PSRs were designed. In total, there are over 20 PSRs, and each PSR is suggested by members and voted on according to an established protocol, so the group acts consistently and in line with its agreed-upon processes. Our expertise in PHP stems from our focus on Drupal, an enterprise CMS built using PHP. If you are looking to develop a custom module for Drupal or need any other Drupal development services, talk to us today!
Categories: FLOSS Project Planets

Valhalla's Things: Crescent Shawl

Planet Debian - Mon, 2024-01-01 19:00
Posted on January 2, 2024

One of the knitting projects I’m working on is a big bottom-up triangular shawl in less-than-fingering weight yarn (NM 1/15): it feels like a cloud should by all rights feel, and I have good expectations out of it, but it’s taking forever and a day.

And then one day last spring I started thinking in the general direction of top-down shawls, and decided I couldn’t wait until I had finished the first one to see if I could design one.

For my first attempt I used an odd ball of 50% wool 50% plastic I had in my stash and worked it on 12 mm tree trunks, and I quickly made something between a scarf and a shawl that got some use during the summer thunderstorms when temperatures got a bit lower, but not really cold. I was happy with the shape, not with the exact position of the increases, but I had ideas for improvements, so I just had to try another time.

Digging through the stash I found four balls of Drops Alpaca in two shades of grey: I had bought it with the intent to test its durability in somewhat more demanding situations (such as gloves or even socks), but then the LYS1 no longer carries it, so I might as well use it for something a bit more one-off (and when I received the yarn it felt so soft that doing something for the upper body looked like a better idea anyway).

I decided to start working in garter stitch with the darker colour, then some garter stitch in the lighter shade and to finish with yo / k2t lace, to make the shawl sort of fade out.

The first half was worked relatively slowly through the summer, and then when I reached the colour change I suddenly picked up working on it and it was finished in a couple of weeks.

looks denser in a nice way, but the lace border is scrunched up.

Then I had doubts on whether I wanted to block it, since I liked the soft feel, but I decided to try it anyway: it didn’t lose the feel, and the look is definitely better, even if it was my first attempt at blocking a shawl and I wasn’t that good at it.

I’m glad that I did it, however, as it’s still soft and warm, but now also looks nicer.

The pattern is of course online as #FreeSoftWear on my fiber craft patterns website.

  1. at least local to somebody: I can’t get to a proper yarn shop by foot, so I’ve bought this yarn online from one that I could in theory reach on a day trip, but it has not happened yet.↩︎

Categories: FLOSS Project Planets

Tellico 3.5.3 Released

Planet KDE - Mon, 2024-01-01 17:52

Tellico 3.5.3 is available, with a few minor clean-ups.

Improvements and Bug Fixes
  • Improved some entry matching heuristics when updating from other sources.
  • Updated the author search for the Open Library data source.
  • Updated Kino-Teatr data source.
  • Fixed compilation for versions of KDE Frameworks < 5.94.
  • Fixed layout bug in Fancy template for custom collections with no image.
Categories: FLOSS Project Planets

Russ Allbery: 2023 Book Reading in Review

Planet Debian - Mon, 2024-01-01 17:06

In 2023, I finished and reviewed 53 books, continuing a trend of year-over-year increases and of reading the most books since 2012 (the last year I averaged five books a month). Reviewing continued to be uneven, with a significant slump in the summer and smaller slumps in February and November, and a big clump of reviews finished in October in addition to my normal year-end reading and reviewing vacation.

The unevenness this year was mostly due to finishing books and not writing reviews immediately. Reviews are much harder to write when the finished books are piling up, so one goal for 2024 is to not let that happen again. I enter the new year with one book finished and not yet reviewed, after reading a book about every day and a half during my December vacation.

I read two all-time favorite books this year. The first was Emily Tesh's debut novel Some Desperate Glory, which is one of the best space opera novels I have ever read. I cannot improve on Shelley Parker-Chan's blurb for this book: "Fierce and heartbreakingly humane, this book is for everyone who loved Ender's Game, but Ender's Game didn't love them back." This is not hard science fiction but it is fantastic character fiction. It was exactly what I needed in the middle of a year in which I was fighting a "burn everything down" mood.

The second was Night Watch by Terry Pratchett, the 29th Discworld and 6th Watch novel. Throughout my Discworld read-through, Pratchett felt like he was on the cusp of a truly stand-out novel, one where all the pieces fit and the book becomes something more than the sum of its parts. This was that book. It's a book about ethics and revolutions and governance, but also about how your perception of yourself changes as you get older. It does all of the normal Pratchett things, just... better. While I would love to point new Discworld readers at it, I think you do have to read at least the Watch novels that came before it for it to carry its proper emotional heft.

This was overall a solid year for fiction reading. I read another 15 novels I rated 8 out of 10, and 12 that I rated 7 out of 10. The largest contributor to that was my Discworld read-through, which was reliably entertaining throughout the year. The run of Discworld books between The Fifth Elephant (read late last year) and Wintersmith (my last of this year) was the best run of Discworld novels so far. One additional book I'll call out as particularly worth reading is Thud!, the Watch novel after Night Watch and another excellent entry.

I read two stand-out non-fiction books this year. The first was Oliver Darkshire's delightful memoir about life as a rare book seller, Once Upon a Tome. One of the things I will miss about Twitter is the regularity with which I stumbled across fascinating people and then got to read their books. I'm off Twitter permanently now because the platform is designed to make me incoherently angry and I need less of that in my life, but it was very good at finding delightfully quirky books like this one.

My other favorite non-fiction book of the year was Michael Lewis's Going Infinite, a profile of Sam Bankman-Fried. I'm still bemused at the negative reviews that this got from people who were upset that Lewis didn't turn the story into a black-and-white morality play. Bankman-Fried's actions were clearly criminal; that's not in dispute. Human motivations can be complex in ways that are irrelevant to the law, and I thought this attempt to understand that complexity by a top-notch storyteller was worthy of attention.

Also worth a mention is Tony Judt's Postwar, the first book I reviewed in 2023. A sprawling history of post-World-War-II Europe will never have the sheer readability of shorter, punchier books, but this was the most informative book that I read in 2023.

2024 should see the conclusion of my Discworld read-through, after which I may return to re-reading Mercedes Lackey or David Eddings, both of which I paused to make time for Terry Pratchett. I also have another re-read similar to my Chronicles of Narnia reviews that I've been thinking about for a while. Perhaps I will start that next year; perhaps it will wait for 2025.

Apart from that, my intention as always is to read steadily, write reviews as close to when I finished the book as possible, and make reading time for my huge existing backlog despite the constant allure of new releases. Here's to a new year full of more new-to-me books and occasional old favorites.

The full analysis includes some additional personal reading statistics, probably only of interest to me.

Categories: FLOSS Project Planets

Petter Reinholdtsen: Welcome out of prison, Mickey, hope you find some freedom!

Planet Debian - Mon, 2024-01-01 15:00

Today, the animation figure Mickey Mouse finally was released from the corporate copyright prison, as the 1928 movie Steamboat Willie entered the public domain in the USA. This movie was the first public appearance of Mickey Mouse. Sadly the figure is still on probation, thanks to trademark laws and the Disney corporation's powerful pack of lawyers, as described in the 2017 article "How Mickey Mouse Evades the Public Domain" from Priceonomics. On the positive side, the primary driver for repeated extensions of the duration of copyright has been Disney, thanks to Mickey Mouse and the 1928 movie, and as it is now in the public domain I hope this will cause less urge to extend the already unreasonably long copyright duration.

The first book I published, the 2004 book "Free Culture" by Lawrence Lessig, which I published in 2015 in English, French and Norwegian Bokmål, touches on the story of how Disney pushed for extending the copyright duration in the USA. It is a great book explaining the problems with the current copyright regime and why we need the Creative Commons movement, and I strongly recommend everyone to read it.

This movie (with IMDB ID tt0019422) is now available from the Internet Archive. Two copies have been uploaded so far, one uploaded 2015-11-04 (torrent) and the other 2023-01-01 (torrent) - see VLC bittorrent plugin for streaming the video using the torrent link. I am very happy to see the number of public domain movies increasing. I look forward to when those are the majority. Perhaps it will reduce the urge of the copyright industry to control its customers.

A more comprehensive list of works entering the public domain in 2024 is available from the Public Domain Review.

As usual, if you use Bitcoin and want to show your support of my activities, please send Bitcoin donations to my address 15oWEoG9dUPovwmUL9KWAnYRtNJEkP1u1b.

Categories: FLOSS Project Planets

Tim Retout: Prevent DOM-XSS with Trusted Types — a smarter DevSecOps approach

Planet Debian - Mon, 2024-01-01 07:46

It can be incredibly easy for a frontend developer to accidentally write a client-side cross-site-scripting (DOM-XSS) security issue, and yet these are hard for security teams to detect. Vulnerability scanners are slow, and suffer from false positives. Can smarter collaboration between development, operations and security teams provide a way to eliminate these problems altogether?

Google claims that Trusted Types has all but eliminated DOM-XSS exploits on those of their sites which have implemented it. Let’s find out how this can work!

DOM-XSS vulnerabilities are easy to write, but hard for security teams to catch

It is very easy to accidentally introduce a client-side XSS problem. As an example of what not to do, suppose you are setting an element’s text to the current URL, on the client side:

// Don't do this
para.innerHTML = location.href;

Unfortunately, an attacker can now manipulate the URL (and e.g. send this link in a phishing email), and any HTML tags they add will be interpreted by the user’s browser. This could potentially be used by the attacker to send private data to a different server.

Detecting DOM-XSS using vulnerability scanning tools is challenging - typically this requires crawling each page of the website and attempting to detect problems such as the one above, but there is a significant risk of false positives, especially as the complexity of the logic increases.

There are already ways to avoid these exploits — developers should validate untrusted input before making use of it. There are libraries such as DOMPurify which can help with sanitization.1
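As a sketch of that approach (assuming DOMPurify has been loaded on the page), the earlier example becomes:

// Sanitize the untrusted URL before it is interpreted as HTML
para.innerHTML = DOMPurify.sanitize(location.href);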

However, if you are part of a security team with responsibility for preventing these issues, it can be complex to understand whether you are at risk. Different developer teams may be using different techniques and tools. It may be impossible for you to work closely with every developer — so how can you know that the frontend team have used these libraries correctly?

Trusted Types closes the DevSecOps feedback loop for DOM-XSS, by allowing Ops and Security to verify good Developer practices

Trusted Types enforces sanitization in the browser2, by requiring the web developer to assign a particular kind of JavaScript object rather than a native string to .innerHTML and other dangerous properties. Provided these special types are created in an appropriate way, then they can be trusted not to expose XSS problems.
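A minimal sketch of what this looks like in application code, assuming browser support and a policy that delegates sanitization to DOMPurify (the policy name is arbitrary):

// Create a policy once, early in the page's lifecycle
const sanitizer = trustedTypes.createPolicy('sanitize-html', {
  createHTML: (input) => DOMPurify.sanitize(input),
});

// Assignments now go through the policy, which returns a TrustedHTML object
para.innerHTML = sanitizer.createHTML(location.href);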

This approach will work with whichever tools the frontend developers have chosen to use, and detection of issues can be rolled out by infrastructure engineers without requiring frontend code changes.

Content Security Policy allows enforcement of security policies in the browser itself

Because enforcing this safer approach in the browser for all websites would break backwards-compatibility, each website must opt in through Content Security Policy headers.

Content Security Policy (CSP) is a mechanism that allows web pages to restrict what actions a browser should execute on their page, and a way for the site to receive reports if the policy is violated.

Figure 1: Content-Security-Policy browser communication

This is revolutionary, because it allows servers to receive feedback in real time on errors that may be appearing in the browser’s console.
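For Trusted Types specifically, the relevant directives look roughly like this (the policy name matches the sketch above and the reporting endpoint is illustrative); during rollout you would send the same directives under the Content-Security-Policy-Report-Only header to get reports without enforcement:

Content-Security-Policy: require-trusted-types-for 'script'; trusted-types sanitize-html; report-uri /csp-reports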

Trusted Types can be rolled out incrementally, with continuous feedback

Web.dev’s article on Trusted Types explains how to safely roll out the feature using the features of CSP itself:

  • Deploy a CSP collector if you haven’t already
  • Switch on CSP reports without enforcement (via Content-Security-Policy-Report-Only headers)
  • Iteratively review and fix the violations
  • Switch to enforcing mode when there are a low enough rate of reports

Static analysis in a continuous integration pipeline is also sensible — you want to prevent regressions shipping in new releases before they trigger a flood of CSP reports. This will also give you a chance of finding any low-traffic vulnerable pages.

Smart security teams will use techniques like Trusted Types to eliminate entire classes of bugs at a time

Rather than playing whack-a-mole with unreliable vulnerability scanning or bug bounties, techniques such as Trusted Types are truly in the spirit of ‘Secure by Design’ — build high quality in from the start of the engineering process, and do this in a way which closes the DevSecOps feedback loop between your Developer, Operations and Security teams.

  1. Sanitization libraries are especially needed when the examples become more complex, e.g. if the application must manipulate the input. DOMPurify version 1.0.9 also added Trusted Types support, so can still be used to help developers adopt this feature. ↩︎

  2. Trusted Types has existed in Chrome and Edge since 2020, and should soon be coming to Firefox as well. However, it’s not necessary to wait for Firefox or Safari to add support, because the large market share of Chrome and Edge will let you identify and fix your site’s DOM-XSS issues, even if you do not set enforcing mode, and users of all browsers will benefit. Even so, it is great that Mozilla is now on board. ↩︎

Categories: FLOSS Project Planets

Pages