Feeds

The Drop Times: DrupalCon Barcelona Day 2 - Drupal AI, Ethical hacking and Digital communities

Planet Drupal - Thu, 2024-09-26 02:40
Day 2 of DrupalCon Europe 2024 in Barcelona offered thrilling sessions, including insights into AI's role in web development and the inspiring mission of "Defend Iceland" to safeguard digital spaces. With discussions on enhancing digital ecosystems and vibrant community building, the day left attendees eager for more—before unwinding with beach parties and a night at Camp Nou.
Categories: FLOSS Project Planets

The Drop Times: DrupalCon Barcelona 2024 Keynote: The Vision Behind 'Defend Iceland'

Planet Drupal - Thu, 2024-09-26 01:40
At DrupalCon Barcelona 2024, Theódór Ragnar Gíslason, CTO of Syndis, presented Defend Iceland, a pioneering initiative aimed at building safer digital communities through a nationwide bug bounty platform. This community-driven approach empowers citizens, ethical hackers, and organizations to collaborate on strengthening cybersecurity. In his keynote, Theódór shared how Iceland’s model for collective cyber resilience can be scaled globally, offering a fresh perspective on tackling digital threats in an increasingly interconnected world.
Categories: FLOSS Project Planets

Mike Driscoll: Textual – The New MaskedInput Widget

Planet Python - Wed, 2024-09-25 22:11

Textual v0.80.0 was released today, and it included the brand-new MaskedInput widget. If you have used other GUI toolkits, such as wxPython, you might already be familiar with a masked input widget. These widgets allow you to control the user’s input based on a mask string that the developer provides when instantiating the widget.

Let’s spend a few brief moments learning how this new widget works.

Getting the Latest Textual

Before you can use the MaskedInput widget, you must ensure you have version 0.80.0 or greater. If you have an older version, then you’ll need to upgrade by running the following command in your terminal:

python -m pip install textual --upgrade
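If you want to double-check which version actually got installed, one quick way (a minimal sketch using only the standard library; run it with the same interpreter you installed Textual into) is:

import importlib.metadata

# Prints the installed Textual version, e.g. "0.80.0" or newer
print(importlib.metadata.version("textual"))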

Now that you have version 0.80.0 or greater, you can use this great widget!

Using the MaskedInput Widget

The Textual documentation has a credit card masked string for their demo application. Let’s start by looking at that example:

from textual.app import App, ComposeResult
from textual.widgets import Label, MaskedInput


class MaskedInputApp(App):  # (1)!
    CSS = """
    MaskedInput.-valid {
        border: tall $success 60%;
    }
    MaskedInput.-valid:focus {
        border: tall $success;
    }
    MaskedInput {
        margin: 1 1;
    }
    Label {
        margin: 1 2;
    }
    """

    def compose(self) -> ComposeResult:
        yield Label("Enter a valid credit card number.")
        yield MaskedInput(
            template="9999-9999-9999-9999;0",  # (2)!
        )


if __name__ == "__main__":
    app = MaskedInputApp()
    app.run()

The template in this code says that you want four groups of four digits. The widget automatically adds a dash after each group of four digits except the last one.

When you run this code, you will see something like this:

Note that when the widget is empty or not completely filled out, there is a red outline around it. The widget will ignore all keys except for number keys. So if you press A-Z or any special characters like a semi-colon or question mark, the MaskedInput widget will ignore them and nothing will appear in the widget.

When you have entered four groups of four integers though, you will see the red outline turn green. Here’s an example:

Note that this code does NOT actually validate that these are working credit card numbers. It only validates that there are four groups of four integers.

A more common use case would be a mask that accepts phone numbers. Let's take the code above and rewrite the mask and the label text accordingly:

from textual.app import App, ComposeResult
from textual.widgets import Label, MaskedInput


class MaskedInputApp(App):  # (1)!
    CSS = """
    MaskedInput.-valid {
        border: tall $success 60%;
    }
    MaskedInput.-valid:focus {
        border: tall $success;
    }
    MaskedInput {
        margin: 1 1;
    }
    Label {
        margin: 1 2;
    }
    """

    def compose(self) -> ComposeResult:
        yield Label("Enter a valid phone number.")
        yield MaskedInput(
            template="(999)-999-9999;0",
        )


if __name__ == "__main__":
    app = MaskedInputApp()
    app.run()

If you run this code, the output is a little different, but the concept is the same:

The MaskedInput widget uses regular expressions to mask characters. If you look at the documentation, you can see how it works. Here are a few examples from the documentation:

  • A: matches [A-Za-z] (required)
  • a: matches [A-Za-z] (not required)
  • N: matches [A-Za-z0-9] (required)
  • n: matches [A-Za-z0-9] (not required)
  • X: matches [^ ] (required)
  • x: matches [^ ] (not required)
  • 9: matches [0-9] (required)

This widget may get additional updates to include more refined regular expressions, so it is definitely best to review the documentation for the latest information on the masked templates and how they work.
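To tie the characters above back to the earlier examples, here is a minimal sketch (my own illustration, not taken from the Textual docs) that uses the required-digit character 9 to mask a numeric date; the literal slashes are inserted automatically and ;0 sets the placeholder character, just like in the credit card example:

from textual.app import App, ComposeResult
from textual.widgets import Label, MaskedInput


class DateInputApp(App):
    def compose(self) -> ComposeResult:
        yield Label("Enter a date (DD/MM/YYYY).")
        # "9" requires a digit in each position; "/" is a literal separator
        yield MaskedInput(template="99/99/9999;0")


if __name__ == "__main__":
    app = DateInputApp()
    app.run()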

Wrapping Up

The MaskedInput widget is a great addition to the Textual TUI toolkit. There are lots of great widgets included with Textual now. While there may not be as many as wxPython or PyQt, those projects have been around for a couple of decades and have had much more time to mature. You can create really amazing and neat applications in your terminal with Python and Textual. You should try it out today!

The post Textual – The New MaskedInput Widget appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

Russell Coker: The PiKVM

Planet Debian - Wed, 2024-09-25 19:01
Hardware

I have just set up a PiKVM; here's the Amazon link for the KVM hardware (case, Pi hat, etc.) and here's an Amazon link for a Pi4 to match.

The PiKVM web site has good documentation [1] and they have a YouTube channel with videos showing how to assemble the devices [2]. It's really convenient being able to change the playback speed from low speeds (like 1/4 of the original speed) to double speed when watching such a video. One thing to note is that there are some revisions to the hardware that aren't covered in the videos; the device I received had some improvements that made it easier to assemble which weren't shown in the video.

When you buy the device and the Pi you also need to get an SD card of at least 4G in size, a CR1220 battery for the real-time clock, and a USB-2/3 to USB-C cable for keyboard/mouse (it MUST NOT be USB-C to USB-C!). When I first tried using it I used a USB-C to USB-C cable for keyboard and mouse and it didn't work for reasons I don't understand (I welcome comments with theories about this). You also need a micro-HDMI to HDMI cable to get video output if you want to set it up without having to find the IP address and ssh to it.

The system has a bright OLED display to show the IP address and some other information which is very handy.

The hardware is easy enough for a 12yo to set up. The construction of the parts is solid and well engineered, with everything fitting together nicely. It has a PCI/PCIe slot adaptor for controlling power and sending LED status over the connection, which I didn't test. I definitely recommend this.

Software

This is the download link for the RaspberryPi images for the PiKVM [3]. The “v3” image matches the hardware from the Amazon link I provided.

The default username/password is root/root. Connect it to a HDMI monitor and USB keyboard to change the password etc. If you control the DHCP server you can find the IP address it’s using and ssh to it to change the password (it is configured to allow ssh as root with password authentication).

If you get the kit to assemble it (as opposed to buying a completed unit already assembled) then you need to run the following commands as root to enable the OLED display. This means that after assembling it you can’t get the IP address without plugging in a monitor with a micro-HDMI to HDMI cable or having access to the DHCP server logs.

rw
systemctl enable --now kvmd-oled kvmd-oled-reboot kvmd-oled-shutdown
systemctl enable --now kvmd-fan
ro

The default webadmin username/password is admin/admin.

To change the passwords run the following commands:

rw
kvmd-htpasswd set admin
passwd root
ro

It is configured to have the root filesystem mounted read-only which is something I thought had gone out of fashion decades ago. I don’t think that modern versions of the Ext3/4 drivers are going to corrupt your filesystem if you have it mounted read-write when you reboot.

By default it uses a self-signed SSL certificate, so with a Chrome-based browser you get an error when you connect where you have to select "advanced" and then tell it to proceed regardless. I presume you could use the DNS method of Certbot authentication to get an SSL certificate to use on an internal view of your DNS to make it work normally with SSL.

The web based software has all the features you expect from a KVM. It shows the screen in any resolution up to 1920*1080 and proxies keyboard and mouse. Strangely, "lsusb" on the machine being managed reports only a single USB device entry for it, which covers both keyboard and mouse.

Managing Computers

For a tower PC disconnect any regular monitor(s) and connect a HDMI port to the HDMI input on the KVM. Connect a regular USB port (not USB-C) to the “OTG” port on the KVM, then it should all just work.

For a laptop connect the HDMI port to the HDMI input on the KVM. Connect a regular USB port (not USB-C) to the “OTG” port on the KVM. Then boot it up and press Fn-F8 for Dell, Fn-F7 for Lenovo or whatever the vendor code is to switch display output to HDMI during the BIOS initialisation, then Linux will follow the BIOS and send all output to the HDMI port for the early stages of booting. Apparently Lenovo systems have the Fn key mapped in the BIOS so an external keyboard could be used to switch between display outputs, but the PiKVM software doesn’t appear to support that. For other systems (probably including the Dell laptops that interest me) the Fn key apparently can’t be simulated externally. So for using this to work on laptops in another city I need to have someone local press Fn-F8 at the right time to allow me to change BIOS settings.

It is possible to configure the Linux kernel to mirror display to external HDMI and an internal laptop screen. But this doesn’t seem useful to me as the use cases for this device don’t require that. If you are using it for a server that doesn’t have iDRAC/ILO or other management hardware there will be no other “monitor” and all the output will go through the only connected HDMI device. My main use for it in the near future will be for supporting remote laptops, when Linux has a problem on boot as an easier option than talking someone through Linux commands and for such use it will be a temporary thing and not something that is desired all the time.

For the gdm3 login program you can copy the .config/monitors.xml file from a GNOME user session to the gdm home directory to keep the monitor settings. This configuration option is decent for the case where a fixed set of monitors are used but not so great if your requirement is “display a login screen on anything that’s available”. Is there an xdm type program in Debian/Ubuntu that supports this by default or with easy reconfiguration?

Conclusion

The PiKVM is a well engineered and designed product that does what’s expected at a low price. There are lots of minor issues with using it which aren’t the fault of the developers but are due to historical decisions in the design of BIOS and Linux software. We need to change the Linux software in question and lobby hardware vendors for BIOS improvements.

The feature for connecting to an ATX PSU was unexpected and could be really handy for some people; it's not something I have an immediate use for, but it is something I could possibly use in future. I like the way they shipped the hardware for it as part of the package, giving the user choices about how they use it; many vendors would make it an optional extra that costs another $100. This gives the PiKVM more functionality than many devices that are much more expensive.

The web UI wasn’t as user friendly as it might have been, but it’s a lot better than iDRAC so I don’t have a serious complaint about it. It would be nice if there was an option for creating macros for keyboard scancodes so I could try and emulate the Fn options and keys for volume control on systems that support it.

Categories: FLOSS Project Planets

Jonathan Dowland: whisper

Planet Debian - Wed, 2024-09-25 16:34
Whisper (pipewire tool)

It's time to mint a new blog tag…

I want to pour some praise on some software I recently discovered.

I'm not up to speed on Pipewire—the latest piece of Linux plumbing related to audio—nor how it relates to the other bits (Pulseaudio, ALSA, JACK, what else?). I recently tried to plug something into the line-in port on my external audio interface, and wished to hear it on the machine. A simple task, you'd think.

I'll refrain from writing about the stuff that didn't work well and focus on the thing that did: A little tool called Whisper, which is designed to let you listen to a microphone through your speakers.

Whisper's UI. Screenshot from upstream.

Whisper does a great job of hiding the complexity of what lies beneath and asking two questions: which microphone, and which speakers? In my case this alone was not quite enough, as I was presented with two identically-named "SB Live Extigy" "microphone" devices, but that's easily resolved with trial and error.

More stuff like this please!

Categories: FLOSS Project Planets

mark.ie: My LocalGov Drupal contributions for week-ending September 27th, 2024

Planet Drupal - Wed, 2024-09-25 14:14

I spent a lot of time this week working on LocalGov Base. It's great to have a strong core that all other themes can build upon.

Categories: FLOSS Project Planets

FSF Events: Free Software Directory meeting on IRC: Friday, September 27, starting at 12:00 EDT (16:00 UTC)

GNU Planet! - Wed, 2024-09-25 13:27
Join the FSF and friends on Friday, September 27 from 12:00 to 15:00 EDT (16:00 to 19:00 UTC) to help improve the Free Software Directory.
Categories: FLOSS Project Planets

Django Weblog: 2025 DSF Board Nominations

Planet Python - Wed, 2024-09-25 12:03

Nominations are open for the 2025 Django Software Foundation Board of Directors.

In 2023 we introduced a staggered term for directors. Of our 7 directors, there are 4 positions currently open, with each position serving for two years.

Decisions around the 2025 officer roles will be made during the meeting of the new board. You don’t need to specify which position you are nominating for.

As you know, the Board guides the direction of the marketing, governance and outreach activities of the Django community. We provide funding, resources, and guidance to Django events on a global level. Further we provide support to the Django community with an established Code of Conduct and make decisions and enforcement recommendations for violations. We work with our corporate and individual members to raise funds to help support our great community.

In order for our community to continue to grow and advance the Django Web framework, we need your help. The Board of Directors consists of seven volunteers who are elected to two year terms. This is an excellent opportunity to help advance Django. We can’t do it without volunteers, such as yourself. Anyone including current Board members, DSF Members, or the public at large can apply to the Board. It is open to all.

2025 DSF Board Nomination Form

If you are interested in helping to support the development of Django we’d enjoy receiving your application for the Board of Directors. Please fill out the 2025 DSF Board Nomination form by October 25, 2024 Anywhere on Earth to be considered.

If you have any questions about applying, the work, or the process in general please don’t hesitate to reach out via email to foundation@djangoproject.com.

Thank you for your time and we look forward to working with you in 2025.

The 2024 DSF Board of Directors.

Categories: FLOSS Project Planets

libtool @ Savannah: libtool-2.5.3 released [stable]

GNU Planet! - Wed, 2024-09-25 11:57

Libtoolers!

The Libtool Team is pleased to announce the release of libtool 2.5.3.

GNU Libtool hides the complexity of using shared libraries behind a
consistent, portable interface. GNU Libtool ships with GNU libltdl, which
hides the complexity of loading dynamic runtime libraries (modules)
behind a consistent, portable interface.

There have been 14 commits by 2 people in the 27 days since 2.5.2.

See the NEWS below for a brief summary. An alpha and two beta releases
of GNU Libtool have been released prior to this stable release. Please
view the NEWS entries for those releases for a more complete summary of
the updates between stable releases 2.4.7 and 2.5.3.

Thanks to everyone who has contributed!
The following people contributed changes to this release:

  Bruno Haible (3)
  Ileana Dumitrescu (11)

Ileana
 [on behalf of the libtool maintainers]
==================================================================

Here is the GNU libtool home page:
    https://gnu.org/s/libtool/

For a summary of changes and contributors, see:
  https://git.sv.gnu.org/gitweb/?p=libtool.git;a=shortlog;h=v2.5.3
or run this command from a git-cloned libtool directory:
  git shortlog v2.5.2..v2.5.3

Here are the compressed sources:
  https://ftpmirror.gnu.org/libtool/libtool-2.5.3.tar.gz   (2.0MB)
  https://ftpmirror.gnu.org/libtool/libtool-2.5.3.tar.xz   (1.1MB)

Here are the GPG detached signatures:
  https://ftpmirror.gnu.org/libtool/libtool-2.5.3.tar.gz.sig
  https://ftpmirror.gnu.org/libtool/libtool-2.5.3.tar.xz.sig

Use a mirror for higher download bandwidth:
  https://www.gnu.org/order/ftp.html

Here are the SHA1 and SHA256 checksums:

  f48e2fcdb0b80f97e93366c41fdcd1ea90f2f253  libtool-2.5.3.tar.gz
  kyK9j2vISP2j44WJndGTSVcWllKs73FtGdGdJAU6u5U=  libtool-2.5.3.tar.gz
  f1450b2f652d9acf3b83eee823cad966a149cca4  libtool-2.5.3.tar.xz
  iYARIyzFm2s7u+Mhtgq6nbGsEVeKth7Q3wKZRYFGri4=  libtool-2.5.3.tar.xz

Verify the base64 SHA256 checksum with cksum -a sha256 --check
from coreutils-9.2 or OpenBSD's cksum since 2007.

Use a .sig file to verify that the corresponding file (without the
.sig suffix) is intact.  First, be sure to download both the .sig file
and the corresponding tarball.  Then, run a command like this:

  gpg --verify libtool-2.5.3.tar.gz.sig

The signature should match the fingerprint of the following key:

  pub   rsa4096 2021-09-23 [SC]
        FA26 CA78 4BE1 8892 7F22  B99F 6570 EA01 146F 7354
  uid   Ileana Dumitrescu <ileanadumi95@protonmail.com>
  uid   Ileana Dumitrescu <ileanadumitrescu95@gmail.com>

If that command fails because you don't have the required public key,
or that public key has expired, try the following commands to retrieve
or refresh it, and then rerun the 'gpg --verify' command.

  gpg --locate-external-key ileanadumi95@protonmail.com

  gpg --recv-keys 6570EA01146F7354

  wget -q -O- 'https://savannah.gnu.org/project/release-gpgkeys.php?group=libtool&download=1' | gpg --import -

As a last resort to find the key, you can try the official GNU
keyring:

  wget -q https://ftp.gnu.org/gnu/gnu-keyring.gpg
  gpg --keyring gnu-keyring.gpg --verify libtool-2.5.3.tar.gz.sig

This release was bootstrapped with the following tools:
  Autoconf 2.72e
  Automake 1.17
  Gnulib v1.0-803-g30417e7f91

NEWS

  • Noteworthy changes in release 2.5.3 (2024-09-25) [stable]


** New features:

  - Add 'aarch64' support to the file magic test, which allows for
    shared libraries to be built with Mingw for aarch64.

** Bug fixes:

  - The configure options --with-pic and --without-pic have been renamed
    to --enable-pic and --disable-pic, respectively.  The old names
    --with-pic and --without-pic are still supported, though, for
    backward compatibility.

  - The configure option --with-aix-soname has been renamed to
    --enable-aix-soname.  The old name --with-aix-soname is still
    supported, though, for backward compatibility.

  - Fix conflicting warnings about AC_PROG_RANLIB.

  - Document situations where -export-symbols does not work.

  - Update FSF office address with URL in each file's license block.

  - Add checks for aclocal in standalone.at and subproject.at test files
    that report failures in Linux From Scratch and Darwin builds.
   

Enjoy!

Categories: FLOSS Project Planets

Drupal Association blog: Introducing the Revolutionary Drupal CMS at DrupalCon Barcelona: A New Era in Drupal with AI-Enabled Website Building

Planet Drupal - Wed, 2024-09-25 10:59

BARCELONA, Spain, 25 September 2024 — This September, the digital world witnessed a groundbreaking moment at DrupalCon Europe in Barcelona (23-27 September), as Dries Buytaert, the visionary founder of Drupal, unveiled a preview of Drupal CMS in his landmark 40th Driesnote address. This launch isn't just an announcement; we're standing at the threshold of a new era in the Drupal story, marking the most significant evolution since its birth in 2001.

Drupal CMS pairs Drupal's existing enterprise-grade security and scalability with new AI capabilities and a revolutionized user experience through Experience Builder, all wrapped in an easy-to-use interface suited to any skill level.

“The product strategy is for Drupal CMS to be the gold standard for no-code website building,” said Dries Buytaert, Drupal Founder and Project Lead. “Our goal is to empower non-technical users like digital marketers, content creators, and site-builders to create exceptional digital experiences without requiring developers… We are not abandoning Drupal Core, developers, or enterprise users. Drupal CMS is built on Drupal Core, so we need to make sure Drupal Core does well. The way to think about Drupal CMS is a product or strategy that complements what we already do.”

Drupal CMS: The Future Unveils in Early 2025

Imagine a world where the Enterprise-class power of Drupal is not just for the technically adept but is accessible to everyone. This is the reality Drupal CMS brings to the table. Tailor-made to empower marketers, web designers, and organizations, the preconfigured options are the key to effortlessly crafting and managing websites that stand out. 

Why Drupal CMS is a Game-Changer  
  • Smart Defaults and Pre-Built Solutions: Out-of-the-box readiness meets customizable solutions—whether for corporate sites, blogs, or marketing platforms, enriched with advanced site-building capabilities. 

  • Seamless Onboarding: The intuitive install process is like having a guide, helping you cherry-pick the features your project needs.

  • AI-Powered Efficiency: With cutting-edge AI tools such as AI site migration and alt text creation, site building accelerates as you edit content types, set up fields, craft taxonomies, and more, with AI agents transparently handling various website actions.

  • Advanced SEO: Site builders can use powerful SEO features such as sitemaps, focus keywords, metatags, and more with minimal effort.

  • Open-Source Power: Drupal CMS thrives on open-source collaboration, harnessing the collective innovation of a global community. Full customizability and endless extensions prevent vendor lock-in.

Ready to Dive In? 

Curious minds can explore a demo in the Driesnote today and experience the future of web building firsthand. And stay tuned: the official launch is set for 15 January 2025, with even more innovations on the horizon.

A Responsible Approach to AI  

In line with Drupal's open web manifesto, a responsible AI policy ensures every AI feature in Drupal CMS is crafted with ethical considerations at its core—because innovation should go hand-in-hand with integrity. 

Join us on this exhilarating journey as we step into the future with Drupal CMS—where imagination meets innovation, and building websites will never be the same.

For a closer look at Dries's presentation and to explore Drupal CMS, watch his full Driesnote.

About DrupalCon

DrupalCon is the flagship event for the global Drupal community, bringing together thousands of developers, designers, marketers, and business leaders to connect, learn, and share knowledge about Drupal. Held in various cities around the world, DrupalCon features keynotes, workshops, and networking opportunities that highlight the latest innovations in the Drupal ecosystem.

About Dries Buytaert

Belgium-born Drupal founder Dries Buytaert is a pioneer in the Open Source web publishing and digital experience platform space. He is the Founder of Drupal, the Open Source software for building websites and digital experiences. Drupal is one of the largest and most active Open Source projects in the world. Dries has been working on Drupal for more than two decades and continues to lead the project today as Drupal's Project Lead.

Categories: FLOSS Project Planets

Real Python: Python 3.13 Preview: A Modern REPL

Planet Python - Wed, 2024-09-25 10:00

One of Python’s strong points is its interactive capabilities. By running python you start the interactive interpreter, or REPL, which allows you to perform quick calculations or explore and experiment with your code. In Python 3.13, the interactive interpreter has been completely redesigned with new modern features.

Python’s REPL has remained largely unchanged for decades. Instead, alternative interpreters like IPython, bpython, and ptpython have addressed some of the built-in REPL’s shortcomings, providing more convenient interactive workflows for developers. As you’re about to learn, Python 3.13 brings many significant improvements to the interactive interpreter.

In this tutorial, you’ll:

  • Run Python 3.13 and explore the new REPL
  • Browse through the help system
  • Work with multiline statements
  • Paste code into your REPL session
  • Navigate through your interpreter history

The upgraded REPL is just one of the new features coming in Python 3.13. You can read about all the changes in the what’s new section of Python’s changelog. Additionally, you can dig deeper into the work done on free threading and a Just-In-Time compiler.

Get Your Code: Click here to download the free sample code that shows you how to use some of the new features in Python 3.13.

A New Interactive Interpreter (REPL) in Python 3.13

To try out the new REPL for yourself, you need to get your hands on a version of Python 3.13. Before the official release in October 2024, you can install a pre-release version. After October 2024, you should be able to install Python 3.13 through any of the regular channels.

A REPL, or a Read-Eval-Print Loop, is a program that allows you to work with code interactively. The REPL reads your input, evaluates it, and prints the result before looping back and doing the same thing again.
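As a toy illustration of that loop (my own sketch, not how CPython's actual REPL is implemented), you could express the idea in a few lines of Python:

# Toy read-eval-print loop: read a line, evaluate it, print the result, repeat.
while True:
    try:
        line = input(">>> ")          # Read
    except EOFError:                  # Ctrl-D ends the session, like the real REPL
        break
    try:
        result = eval(line)           # Eval (expressions only, in this toy version)
        if result is not None:
            print(result)             # Print
    except Exception as exc:
        print(f"Error: {exc}")        # Keep looping even when the input fails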

Note: To learn about the built-in REPL in Python and how it works in general, check out The Python Standard REPL: Try Out Code and Ideas Quickly.

In any version of Python, you can start this interactive shell by typing the name of your Python executable in your terminal. Typically, this will be python, but depending on your operating system and your setup, you may have to use something like py or python3 instead.

Once you start the REPL in Python 3.13, you’ll see a small but noticeable difference. The familiar Python interactive shell prompt, consisting of three right angle brackets (>>>), is now colored differently from the rest of the text:

The color difference indicates that the shell now supports color. In the new shell, color is mainly used to highlight output in tracebacks. If your terminal doesn’t display color, then the new REPL will automatically detect this and fall back to its plain, colorless display.

If you prefer to keep your interpreter free of color even when it’s supported, you can disable this new feature. One option is to set the new environment variable PYTHON_COLORS to 0:

Shell
$ PYTHON_COLORS=0 python
Python 3.13.0rc2 (main, Sep 13 2024, 17:09:27) [GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>>

Setting PYTHON_COLORS=0 disables color in the REPL. If you set the environment variable to 1, you’ll get the colored prompt and output. However, since it’s the default, this is rarely necessary.

Before going any further, you’ll exit the REPL. As you may know, the old REPL that you’ve used on Python 3.12 and earlier has been widely commented on for the following idiosyncrasy:

Python
>>> exit
Use exit() or Ctrl-D (i.e. EOF) to exit

The shell clearly understands your intention to end the session. Still, it makes you jump through hoops and add parentheses to your command. In Python 3.13, the REPL now understands special commands that you can write without any parentheses:

  • exit or quit: Exit the interpreter
  • help or F1: Access the help system
  • clear: Clear the screen

Having these commands more easily available is a small thing, but it removes some friction when using the interactive interpreter. You can still use parentheses and type something like exit() if you prefer. These commands are not reserved though. That means that you could shadow them with variable assignments:

Python
>>> exit = True
>>> exit
True
>>> exit()
Traceback (most recent call last):
  ...
TypeError: 'bool' object is not callable
>>> del exit
>>> exit

Here you create a variable named exit, effectively disabling the exit command. To end your REPL session, you can either delete the exit variable or use one of the alternative ways to exit the interpreter.

Note: Even with the new improvements, the built-in REPL is not as powerful as some of the third-party alternatives. If you’re already using any of these, there’s no immediate reason to revert back to the standard interactive interpreter.

The main advantage of the new REPL is that it’s installed together with Python. In other words, it’ll always be available for you. This is great if you find yourself in a situation where you can’t or don’t want to install a third-party library.

You can learn more about the alternative interactive interpreters in the following tutorials:

You can also read more about alternative REPLs in the guide to the standard REPL.

Read the full article at https://realpython.com/python313-repl/ »

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Melissa Wen: Reflections on 2024 Linux Display Next Hackfest

Planet Debian - Wed, 2024-09-25 09:50

Hey everyone!

The 2024 Linux Display Next hackfest concluded in May, and its outcomes continue to shape the Linux Display stack. Igalia hosted this year’s event in A Coruña, Spain, bringing together leading experts in the field. Samuel Iglesias and I organized this year’s edition and this blog post summarizes the experience and its fruits.

One of the highlights of this year’s hackfest was the wide range of backgrounds represented by our 40 participants (both on-site and remotely). Developers and experts from various companies and open-source projects came together to advance the Linux Display ecosystem. You can find the list of participants here.

The event covered a broad spectrum of topics affecting the development of Linux projects, user experiences, and the future of display technologies on Linux. From cutting-edge topics to long-term discussions, you can check the event agenda here.

Organization Highlights

The hackfest was marked by in-depth discussions and knowledge sharing among Linux contributors, making everyone inspired, informed, and connected to the community. Building on feedback from the previous year, we refined the unconference format to enhance participant preparation and engagement.

Structured Agenda and Timeboxes: Each session had a defined scope, time limit (1h20 or 2h10), and began with an introductory talk on the topic.

  • Participant-Led Discussions: We pre-selected in-person participants to lead discussions, allowing them to prepare introductions, resources, and scope.
  • Transparent Scheduling: The schedule was shared in advance as GitHub issues, encouraging participants to review and prepare for sessions of interest.

Engaging Sessions: The hackfest featured a variety of topics, including presentations and discussions on how participants were addressing specific subjects within their companies.

  • No Breakout Rooms, No Overlaps: All participants chose to attend all sessions, eliminating the need for separate breakout rooms. We also adapted run-time schedule to keep everybody involved in the same topics.
  • Real-time Updates: We provided notifications and updates through dedicated emails and the event matrix room.

Strengthening Community Connections: The hackfest offered ample opportunities for networking among attendees.

  • Social Events: Igalia sponsored coffee breaks, lunches, and a dinner at a local restaurant.

  • Museum Visit: Participants enjoyed a sponsored visit to the Museum of Estrela Galicia Beer (MEGA).

Fruitful Discussions and Follow-up

The structured agenda and breaks allowed us to cover multiple topics during the hackfest. These discussions have led to new display feature development and improvements, as evidenced by patches, merge requests, and implementations in project repositories and mailing lists.

With the KMS color management API taking shape, we discussed refinements and the best approaches to cover the variety of color pipelines from different hardware vendors. We are also investigating techniques for performant SDR<->HDR content reproduction and for reducing latency and power consumption when using the color blocks of the hardware.

Color Management/HDR

Color Management and HDR continued to be the hottest topic of the hackfest. We had three sessions dedicated to discuss Color and HDR across Linux Display stack layers.

Color/HDR (Kernel-Level)

Harry Wentland (AMD) led this session.

Here, kernel developers shared the color management pipelines of AMD, Intel and NVidia. We had diagrams and explanations from HW-vendor developers that discussed differences, constraints and paths to fit them into the KMS generic color management properties, such as advertising modeset needs, IN_FORMAT, segmented LUTs, interpolation types, etc. Developers from Qualcomm and ARM also added information regarding their hardware.

Upstream work related to this session:

Color/HDR (Compositor-Level)

Sebastian Wick (RedHat) led this session.

It started with Sebastian’s presentation covering Wayland color protocols and compositor implementation. Also, an explanation of APIs provided by Wayland and how they can be used to achieve better color management for applications and discussions around ICC profiles and color representation metadata. There was also an intensive Q&A about LittleCMS with Marti Maria.

Upstream work related to this session:

Color/HDR (Use Cases and Testing)

Christopher Cameron (Google) and Melissa Wen (Igalia) led this session.

In contrast to the other sessions, here we focused less on implementation and more on brainstorming and reflections about real-world SDR and HDR transformations (use and validation) and gainmaps. Christopher gave a nice presentation explaining HDR gainmap images and how we should think of HDR. This presentation and Q&A were important to put participants on the same page about how to transition between SDR and HDR and somehow “emulate” HDR.

We also discussed the usage of a kernel background color property.

Finally, we discussed a bit about Chamelium and the future of VKMS (future work and maintainership).

Power Savings vs Color/Latency

Mario Limonciello (AMD) led this session.

Mario gave an introductory presentation about AMD ABM (adaptive backlight management) that is similar to Intel DPST. After some discussions, we agreed on exposing a kernel property for power saving policy. This work was already merged on kernel and the userspace support is under development.

Upstream work related to this session:

Strategy for video and gaming use-cases

Leo Li (AMD) led this session.

Miguel Casas (Google) started this session with a presentation of Overlays in Chrome/OS Video, explaining the main goal of power saving by switching off GPU for accelerated compositing and the challenges of different colorspace/HDR for video on Linux.

Then Leo Li presented different strategies for video and gaming and we discussed the userspace need of more detailed feedback mechanisms to understand failures when offloading. Also, creating a debugFS interface came up as a tool for debugging and analysis.

Real-time scheduling and async KMS API

Xaver Hugl (KDE/BlueSystems) led this session.

Compositor developers have exposed some issues with doing real-time scheduling and async page flips. One is that the Kernel limits the lifetime of realtime threads and if a modeset takes too long, the thread will be killed and thus the compositor as well. Also, simple page flips take longer than expected and drivers should optimize them.

Another issue is the lack of feedback to compositors about hardware programming time and commit deadlines (the latest possible time to commit). This is difficult to predict from drivers, since it varies greatly with the type of properties. For example, color management updates take much longer.

In this regard, we discussed implementing a hw_done callback to timestamp when the hardware programming of the last atomic commit is complete, as well as an API to pre-program the color pipeline in a kind of A/B scheme. It may not be supported by all drivers, but might be useful in different ways.

VRR/Frame Limit, Display Mux, Display Control, and more… and beer

We also had sessions to discuss a new KMS API to mitigate headaches with VRR and frame limits, such as different brightness levels at different refresh rates, abrupt changes of refresh rates, low frame rate compensation (LFC), precise timing in VRR, and more.

On Display Control we discussed features missing in the current KMS interface for HDR mode, atomic backlight settings, source-based tone mapping, etc. We also discussed the need of a place where compositor developers can post TODOs to be developed by KMS people.

The Content-adaptive Scaling and Sharpening session focused on sharpening and scaling filters. In the Display Mux session, we discussed proposals to expose the capability of dynamic mux switching display signal between discrete and integrated GPUs.

In the last session of the 2024 Display Next Hackfest, participants representing different compositors summarized current and future work and built a Linux Display “wish list”, which includes: improvements to VTTY and HDR switching, better dmabuf API for multi-GPU support, definition of tone mapping, blending and scaling semantics, and Wayland protocols for advertising to clients which colorspaces are supported.

We closed this session with a status update on feature development by compositors, including but not limited to: plane offloading (from libcamera to output) / HDR video offloading (dma-heaps) / plane-based scrolling for web pages, color management / HDR / ICC profiles support, addressing issues such as flickering when color primaries don’t match, etc.

After three days of intensive discussions, all in-person participants went on a guided tour of the Museum of Estrela Galicia Beer (MEGA), pouring and tasting the most famous local beer.

Feedback and Future Directions

Participants provided valuable feedback on the hackfest, including suggestions for future improvements.

  • Schedule and Break-time Setup: Having a pre-defined agenda and schedule provided a better balance between long discussions and mental refreshments, preventing the fatigue caused by endless discussions.
  • Action Points: Some participants recommended explicitly asking for action points at the end of each session and assigning people to follow-up tasks.
  • Remote Participation: Remote attendees appreciated the inclusive setup and opportunities to actively participate in discussions.
  • Technical Challenges: There were bandwidth and video streaming issues during some sessions due to the large number of participants.

Thank you for joining the 2024 Display Next Hackfest

We can’t help but thank the 40 participants, who engaged in-person or virtually on relevant discussions, for a collaborative evolution of the Linux display stack and for building an insightful agenda.

A big thank you to the leaders and presenters of the nine sessions: Christopher Cameron (Google), Harry Wentland (AMD), Leo Li (AMD), Mario Limonciello (AMD), Sebastian Wick (RedHat) and Xaver Hugl (KDE/BlueSystems) for the effort in preparing the sessions, explaining the topics and guiding discussions. My acknowledgement to the other in-person participants that made such an effort to travel to A Coruña: Alex Goins (NVIDIA), David Turner (Raspberry Pi), Georges Stavracas (Igalia), Joan Torres (SUSE), Liviu Dudau (Arm), Louis Chauvet (Bootlin), Robert Mader (Collabora), Tian Mengge (GravityXR), Victor Jaquez (Igalia) and Victoria Brekenfeld (System76). It was an awesome opportunity to meet you and chat face-to-face.

Finally, thanks to the virtual participants who couldn’t make it in person but organized their days to actively participate in each discussion, adding different perspectives and valuable input even remotely: Abhinav Kumar (Qualcomm), Chaitanya Borah (Intel), Christopher Braga (Qualcomm), Dor Askayo (Red Hat), Jiri Koten (RedHat), Jonas Ådahl (Red Hat), Leandro Ribeiro (Collabora), Marti Maria (Little CMS), Marijn Suijten, Mario Kleiner, Martin Stransky (Red Hat), Michel Dänzer (Red Hat), Miguel Casas-Sanchez (Google), Mitulkumar Golani (Intel), Naveen Kumar (Intel), Niels De Graef (Red Hat), Pekka Paalanen (Collabora), Pichika Uday Kiran (AMD), Shashank Sharma (AMD), Sriharsha PV (AMD), Simon Ser, Uma Shankar (Intel) and Vikas Korjani (AMD).

We look forward to another successful Display Next hackfest, continuing to drive innovation and improvement in the Linux display ecosystem!

Categories: FLOSS Project Planets

Mike Driscoll: JupyterLab 101 Kickstarter Stretch Goal

Planet Python - Wed, 2024-09-25 09:26

My Kickstarter for my latest Python book is still going on for another eight days. Now is a great time to pre-order the book as well as get my other Python books.

The project is fully funded, and I added a stretch goal.

Stretch Goal

The stretch goal is $5000 or 350 backers. If we reach either of those, I will create a video course version of the book and release it to all backers. I think being able to show JupyterLab in a video format may be an even better medium for teaching this type of topic, although the book may be better for quick browsing.

What You’ll Learn

In this book, you will learn about the following:

  • Installation and setup of JupyterLab
  • The JupyterLab user interface
  • Creating a Notebook
  • Markdown in Notebooks
  • Menus in JupyterLab
  • Launching Other Applications (console, terminal, text files, etc)
  • Distributing and Exporting Notebooks
  • Debugging in JupyterLab
  • Testing your notebooks
Rewards to Choose From

As a backer of this Kickstarter, you have some choices to make. You can receive one or more of the following, depending on which level you choose when backing the project:

  • An early copy of JupyterLab 101 + all updates including the final version (ALL BACKERS)
  • signed paperback copy (If you choose the appropriate perk)
  • Get all my Python courses hosted on Teach Me Python or another site (If you choose the appropriate perk)
  • T-shirt with the book cover  (If you choose the appropriate perk)

Get the book on Kickstarter today!

The post JupyterLab 101 Kickstarter Stretch Goal appeared first on Mouse Vs Python.

Categories: FLOSS Project Planets

1xINTERNET blog: Community survey analysis and revised implementation of search in Drupal CMS (formerly Starshot)

Planet Drupal - Wed, 2024-09-25 08:00

As track leads for search in Drupal CMS (formerly Starshot), we developed a concept and aligned it with the other leaders. To refine our approach, we conducted a survey to gather feedback and identify potential improvements. This article presents the survey results and the planned implementation.

Categories: FLOSS Project Planets

The Dot is closed. Long live KDE Planet!

Planet KDE - Wed, 2024-09-25 05:45



As KDE grows, so does the interest in each of its projects. Gathering all KDE news in one place no longer works. The volume of updates coming from the KDE community as a whole has become too large to cover in its entirety on the Dot. With this in mind, we are archiving the Dot, but keeping its content accessible for historical reasons.

The news coming out of the community was curated and edited for the Dot. The current rate of news items being published today would've not only made that impractical, but would have also led to things being unjustly left out, giving only a partial view of what was going on.

But we are not leaving you without your source of KDE news! We have figured out something better: we have worked with KDE webmasters to set up a blogging system for contributors. You can now access Announcements, Akademy, the Association news, and the news from your favorite projects directly, unfiltered, unedited, straight from the source.

Or... If you want to keep up with what is going on in ALL KDE projects and news on a daily (often hourly) basis, use the Planet! Access it on the web or add an RSS feed to your reader. You can also follow KDE news as it happens in our Discuss forums and talk about it live with the rest of the community. You can even follow @planet.kde.org@rss-parrot.net on Mastodon to stay up to date.

If you just want the highlights, check out our social media:

Categories: FLOSS Project Planets

Talk Python to Me: #478: When and how to start coding with kids

Planet Python - Wed, 2024-09-25 04:00
Do you have kids? Maybe nieces and nephews? Or maybe you work in a school environment? Maybe it's just friends who know you're a programmer and ask how they should go about introducing programming concepts to them. Anna-Lena Popkes is back on the show to share her research on when and how to teach kids programming. We spend the second half of the episode talking about concrete apps and toys you might consider for each age group. Plus, some of these things are fun for adults too. ;)

Episode sponsors

  • WorkOS: https://talkpython.fm/workos
  • Talk Python Courses: https://talkpython.fm/training

Links from the show

  • Anna-Lena: https://alpopkes.com
  • Magical universe repo: https://github.com/zotroneneis/magical_universe
  • Machine learning basics repo: https://github.com/zotroneneis/machine_learning_basics
  • PyData recording "when and how to start coding with kids": https://www.youtube.com/embed/aTum4tV4geY

Robots and devices

  • Bee Bot: https://www.terrapinlogo.com/beebot-ss.html
  • Cubelets: https://modrobotics.com
  • BBC Microbit: https://microbit.org
  • RaspberryPi: https://www.raspberrypi.com
  • Adafruit Qualia ESP32 for CircuitPython: https://www.adafruit.com/product/5800
  • Zumi: https://www.robolink.com/products/zumi

Board games

  • Think Fun Robot Turtles Board Game: https://www.amazon.com/Think-Fun-Turtles-Coding-Preschoolers/dp/B00HN2BXUY/

Visual programming

  • Scratch Jr.: https://www.scratchjr.org
  • Scratch: https://www.scratch.org
  • Blocky: https://developers.google.com/blockly
  • Microbit's Make Code: https://makecode.microbit.org
  • Code Club: https://codeclubworld.org

Textual programming

  • Code Combat: https://codecombat.com
  • Hedy: https://www.hedycode.com
  • Anvil: https://anvil.works

Coding classes / summer camps (US)

  • Portland Community College Summer Teen Program: https://www.pcc.edu/community/special-classes/teen/

Watch this episode on YouTube: https://www.youtube.com/watch?v=bPaseFf1Hio
Episode transcripts: https://talkpython.fm/episodes/transcript/478/when-and-how-to-start-coding-with-kids

Stay in touch with us

  • Subscribe to us on YouTube: https://talkpython.fm/youtube
  • Follow Talk Python on Mastodon: https://fosstodon.org/web/@talkpython
  • Follow Michael on Mastodon: https://fosstodon.org/web/@mkennedy
Categories: FLOSS Project Planets

Metadrop: Improving headers in a Drupal site using Dries' HTTP Header Analyzer

Planet Drupal - Wed, 2024-09-25 01:30

In a previous article I wrote about the importance of HTTP headers for web security while avoiding the technical stuff. In this article I want to get into all the dirty technical details.

Some time ago we came across the HTTP Header Analyzer by Dries Buytaert. This tool, as the name suggests, analyses the headers of the HTTP response from a website. There are other header analysers out there, but this one, published by the creator of Drupal, also takes into account Drupal-specific headers. The tool displays a report of all headers found, along with an explanation of the purpose of the header and notes on the values of the header, sometimes including recommendations for better values. It also displays information about missing headers that should be present. 

With all this information, the tool gives a score based on the missing headers, warnings and notices detected during the analysis. On the first runs on our website, I have to admit that the score was not bad, but not good: 6/10. Now I am happy to say that we get a score of 10/10 on the home page. Unfortunately, not all pages can get the highest score, as I explain below.
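If you want to do a quick check of your own pages before running them through the analyzer, here is a minimal Python sketch (my own, not part of Dries' tool; the URL and header list are just examples) that prints which common security headers a response carries:

import requests

SECURITY_HEADERS = [
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
    "X-Frame-Options",
    "Referrer-Policy",
    "Permissions-Policy",
]

# Replace with the page you want to inspect
response = requests.get("https://example.com/", timeout=10)
for name in SECURITY_HEADERS:
    value = response.headers.get(name)
    print(f"{name}: {value if value is not None else 'MISSING'}")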

Categories: FLOSS Project Planets

The Drop Times: DrupalCon Barcelona 2024 Day 1: Awards, Driesnote, and Community

Planet Drupal - Wed, 2024-09-25 01:20
DrupalCon Barcelona 2024 kicked off with contribution workshops, the Women in Drupal award ceremony, and Dries Buytaert’s 40th State of Drupal address. Key announcements included the upcoming Drupal CMS 1.0 launch, AI integration, and the Experience Builder initiative.
Categories: FLOSS Project Planets

Specbee: Why You Can’t Ignore Google Crawlability, Indexability & Mobile-First Indexing in SEO

Planet Drupal - Wed, 2024-09-25 00:52
Do you ever feel like you’re living ahead of your time when you watch those world-conquering AI movies? The only reason such movies are gaining popularity with every passing day is because a similar reality is not too far away. The world today is living wire-free and digitally. Online is the popular and future-demanding way of living. And for your business to prosper in this online world, mastering SEO is a necessity! Now, to achieve optimal visibility, you need to understand the three most important concepts: crawlability, indexability, and mobile-first indexing. Why all of them? These three concepts are codependent, and they form the most crucial pillars determining how search engines discover, understand, and rank your website. Your goal here is to make your website stand out in this ever-competitive online space; your tool: SEO. In this blog, we focus on the what and why of all three concepts and the best practices to help your website rank higher on Google search results.

Introduction to Crawlability, Indexability, and Mobile-First Indexing

Each of these three concepts is connected to the others and can therefore get confusing to tell apart without proper analysis.

  • Crawlability refers to the process search engines like Google use to discover web pages easily. In simple words, Google discovers web pages by crawling, using computer programs called web crawlers or bots.
  • Indexing is the process that follows crawling. Indexability refers to the ability of search engines to add pages to their index. Google indexes by analyzing pages and their content to add them to its database of zillions of pages.
  • Mobile-first indexing refers to Google’s crawling process as it prioritizes the mobile version of website content over the desktop version to inform rankings.

Why These Concepts Matter in Today’s SEO Landscape

With the continuous evolution of Google and other search engines, your SEO strategies need to be top-notch. With Google prioritizing mobile traffic, and hence its mobile-first initiative, optimizing your website for both crawlability and indexability is the need of the hour. Without proper measures implemented in these areas, your website may be heading towards stunted organic growth. And you don’t wish for that to happen, do you? So, keep reading till the end.

What is Crawlability in SEO?

Performed by crawler bots, crawlability is the ability of search engines to find and index web content. Google crawl bots are essential computer programs that analyze websites and collect their respective information. This information helps Google index websites and show relevant search results.

Google crawl bots follow a list of web page URLs based on previous crawls and sitemaps. As crawl bots visit a page, they detect links on each page and add those links to the list of pages to crawl. These new sites then become part of SERP results, along with changes to existing sites. Furthermore, dead links are spotted and taken in to update the Google index.

However, it’s important to know how to get Google to crawl your site. If your website is not crawlable, chances are some of your web pages won’t be indexed, leading to a negative impact on your website’s visibility and organic traffic, thus affecting your site’s SEO numbers.

Key Factors Affecting Crawlability

While crawlability is crucial for optimizing your site’s visibility, there are several factors impacting it.
Learn about some of the most crucial ones below:

  • Site structure – Your site structure plays a pivotal role in its crawlability. If any of your pages aren’t linked from any source, they won’t be easily accessible to crawl bots for crawling.
  • Internal links – Web crawlers follow links while crawling your site. These crawl bots only find pages linked to other content. To get Google to crawl your site, make sure to have internal linking structures across your website, allowing crawl bots to easily spot these pages in your site structure.
  • Robots.txt and Meta Tags – These two parameters are crucial to maximizing your crawl budget. Make sure to avoid commands that prevent crawlers from crawling your pages. Check your meta tags and robots.txt file to avoid common errors (a short Python sketch at the end of this article shows one way to test a URL against robots.txt):
      ◦ <meta name="robots" content="noindex"/>: This robots meta tag directive can block your page from indexing.
      ◦ <meta name="robots" content="nofollow">: Although nofollow links allow your pages to be crawled, this removes all permission for crawlers to follow links on your page.
      ◦ User-agent: * Disallow: /: This particular code bars all your pages from getting crawled.
  • Crawl Budget – Google offers limited recommendations for crawl budget optimizations since Google crawl bots usually finish the job without reaching their limit. But B2B and e-commerce sites with several hundreds of landing pages can be at risk of running out of their crawl budget, missing the chance of your web pages making their way to Google’s index.

Tools to Check and Improve Crawlability

There are various tools in the digital realm that claim to improve crawlability. Let me highlight a few of the most popular tools you can use to improve crawlability:

Google Search Console

What better tool to analyze Google crawlability than its in-house tool? Although it’s no surprise how often Google crawls a site, i.e., once every 3 days to 4 weeks, there are no fixed times. How do you fix crawl errors in Google Search Console? You can analyze your page traffic, keyword data, page traffic broken down by device, and page rankings in SERP results. Google Search Console also provides you with insights on how to rank higher on Google search results following Google algorithms. Submit sitemaps and get comprehensive crawl, index, and related information directly from the Google index.

Screaming Frog SEO Spider

This is the tool you want to use to take charge of two essential jobs at once: crawl your website and assist with SEO audits. This web crawler helps increase the chances of your web content appearing in search engine results. Analyze page titles and meta descriptions, and discover duplicate pages – all the actions that impact your site’s SEO. The free version can crawl up to 500 pages on a website and can find if any pages are redirected or if your site has the dreaded 404 errors. If you need more detailed features, the paid version brings many to the table, including JavaScript rendering, Google Analytics integration, custom source code search, and more.

XML Sitemaps

A roadmap for search engine crawlers, XML sitemaps ensure crawlers reach the important pages that need to be crawled and indexed. For better crawlability, ensure that the following requirements are met while creating sitemaps:

  • Choose Canonical URLs: Identify and include only your preferred URLs (canonical URLs) in the sitemap to avoid duplicates from the same content accessible via multiple URLs.
What is Google Indexing in SEO?

Indexing is the process of adding a website's pages to a search engine's database so they can be displayed in search results when they are relevant. Search engines like Google crawl websites, follow the links they find to reach new pages, and analyze the content of each page to determine its quality and relevance. Pages that pass that bar are added to the index. Indexing is one of the core processes of Google Search: it is what allows Google to understand web content and surface it to users, which in turn drives higher SEO rankings.

Factors That Influence Indexability

Given how central indexability is to search, it is no surprise that several factors can help or hinder it. The most common ones are listed below (a canonical-tag sketch follows this list):

  • Content quality and relevance – Search engine bots favor high-quality content. Well-written, original, informative pages with clear headings and an organized structure, relevant to what users are searching for, are the ones that make it into the index.
  • Canonical tags and duplicate content – Duplicate content confuses Google during indexing, and search engines avoid indexing duplicates. Canonical tags are the remedy: they tell search engines which version of a page is the original, so the relevant page is the one that gets indexed.
  • Technical issues – Broken links, slow page load times, noindex tags or directives, redirect loops, and similar problems can all prevent your content from being crawled or indexed and drag down your SEO.
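To make the canonical-tag point concrete, here is a small sketch; the blog URL and its print-friendly duplicate are invented examples:

    <!-- Placed in the <head> of the duplicate (for example, a print-friendly version) -->
    <link rel="canonical" href="https://www.example.com/blog/mobile-first-indexing/" />

With this tag in place, search engines consolidate ranking signals from the duplicates onto the canonical URL and index that one.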
Using Google Search Console for Indexing Your Site

Various tools let you check whether your pages are crawlable or indexed, but since Google is the search engine that matters most, using Google's own indexing tools is the easiest way to meet your SEO goals. Here is a brief walkthrough of requesting indexing through Google Search Console:

  • Verify website ownership – Confirm ownership by adding an HTML file or meta tag.
  • Submit your sitemap – Add your sitemap URL in the Sitemaps section to help Google discover your important pages faster.
  • Use the URL Inspection tool – Submit specific pages for re-crawling or request indexing directly.
  • Monitor coverage issues – Use the Coverage report to identify and fix indexing issues such as blocked pages.
  • Improve internal linking – Ensure pages are properly linked internally for better crawling.
  • Enhance content quality – Avoid low-quality or duplicate content to improve your chances of being indexed.

Fixing Common Google Indexing Issues

Here are some common Google indexing issues (overlapping with, and in addition to, the crawlability issues above) that can affect your site's visibility and performance in the SERPs:

Low-Quality Content

Google aims to rank original, relevant content highest in search results. If your content is low quality, scraped, or stuffed with keywords, Google is unlikely to index it. Cater to your users' needs and provide unique, relevant information that is aligned with Google's webmaster guidelines.

Not Mobile-Friendly

Google prioritizes mobile-friendliness when crawling pages. If your content is not mobile-friendly, Google may well skip indexing it. Use an adaptable design, improve load times, compress images, get rid of intrusive pop-ups, and keep finger reach in mind when building your site.

Missing Sitemap

Without a proper sitemap, it is much harder to get Google to crawl your site. Use an XML sitemap rather than an HTML one to improve how search engines handle your site, and once it is created, submit it through Google Search Console and reference it from your robots.txt file.

Poor Site Structure

Websites that are hard to navigate are often missed by crawl bots, and poor structure hurts both crawlability and indexability. Use a clear website structure and solid internal linking for SEO purposes.

Redirect Loops

Redirect loops stop Google's crawl bots from indexing your pages correctly: the bots get caught in the loop and cannot finish crawling your site. Check your .htaccess file and HTML sources for unintentional or incorrect redirects, and use 301 redirects for permanent moves and 302 redirects for temporary ones to prevent loops from occurring (a short .htaccess sketch follows this section).
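As a minimal sketch of that redirect advice, assuming an Apache server and invented example paths:

    # Permanent move: the old URL should pass its ranking signals to the new one
    Redirect 301 /old-services.html https://www.example.com/services/

    # Temporary move: keep the original URL's signals while the page lives elsewhere
    Redirect 302 /summer-sale/ https://www.example.com/holding-page/

Chains such as A redirecting to B and B back to A are what create loops, so after editing, request each URL once (in a browser or with the URL Inspection tool) and confirm it resolves in a single hop.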
What is the difference between crawlability and indexability?

Because the two concepts are so closely connected, people often mistake one for the other. Crawling is the process by which search engines access and read web pages; indexing is the process of adding those pages to the Google index. Indexing normally follows crawling, although in selective cases a page can end up indexed by Google or other search engines without its content being read.

What is Mobile-First Indexing?

As the name suggests, mobile-first indexing means that Google's crawl bots prioritize indexing the mobile version of your content over the desktop version, and use it to inform rankings. If your content is not optimized for mobile devices, your rankings can suffer, even for desktop users. Mobile-friendly content gives your website the edge in relevance and accessibility, and therefore in search visibility. Mobile-first indexing is a priority today because the number of mobile users keeps growing, and a seamless mobile browsing experience with relevant, navigable content reduces bounce rates and improves your site's overall SEO performance.

The Impact of Mobile-First Indexing on SEO

Mobile-first indexing is a necessity today, and it directly shapes how crawlability and indexability play out for SEO. Optimize your website for mobile devices with an adaptable design: you can build a responsive site with a mobile-friendly theme, maintain a completely separate mobile website, or add app-based solutions or AMP pages to improve performance on mobile devices. Small businesses, e-commerce and B2B websites, and enterprise-level organizations are all strongly affected by the shift to mobile-first indexing, so it is high time to deliver mobile-friendly websites that meet customer expectations. SEO professionals have seen a surge in demand for mobile-responsive design and development services, which makes aligning your SEO strategy with Google's algorithm essential if you want to stand alongside your competitors in a mobile-first landscape.

Are mobile-first indexing and mobile usability the same?

Definitely not. A website may contain content that qualifies for mobile-first indexing and still not be usable on mobile devices.

Best Practices for Optimizing Crawlability and Indexability in a Mobile-First World

In a mobile-first world, optimizing your web pages is how you gain visibility and perform well in search engine results. The practices below, supported by Google and other search engines, help improve the crawlability and indexability of your website with mobile-first indexing as the priority (a small responsive-design sketch follows this list):

  • Mobile-friendliness – An effective responsive design serves the same HTML on the same URL regardless of the device used, and the layout adapts to the screen size automatically. According to Google, responsive designs are the easiest to build and maintain.
  • Navigation – Mobile users often operate their devices with one hand, so design for fluid, finger-friendly navigation: larger buttons and menus, ample spacing, and well-sized interactive elements. A pro tip: keep tap targets at a minimum of 48 x 48 px.
  • Content consistency – Use the same meta robots tags on the desktop and mobile versions of your site, and make sure your robots.txt file does not block access to the main content, especially on mobile devices. Once mobile-first indexing is enabled for your site, Google's crawl bots can only crawl and index what the mobile version exposes.
  • Page speed optimization – Avoid lazy-loading primary content behind user interactions; slow loading hurts the experience for mobile users and crawlers alike. Optimize your images without compromising quality, minimize HTTP requests, leverage browser caching, and use a CDN so your content loads quickly across the globe.
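Here is a minimal sketch of the responsive, same-HTML approach described above; the breakpoint and class name are illustrative choices, not values from the article:

    <!-- In the <head>: let the layout scale to the device width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* Single-column layout by default (mobile) */
      .content { width: 100%; padding: 16px; }

      /* Wider screens get a centered, fixed-width column */
      @media (min-width: 768px) {
        .content { width: 720px; margin: 0 auto; }
      }
    </style>

Because the same HTML and the same URLs serve every device, Googlebot's mobile crawler sees exactly the content that desktop visitors see, which is what mobile-first indexing expects.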
Tools and Techniques for Ongoing Optimization

With mobile-first indexing in mind, it pays to keep up with current practice and check your site for mobile-friendliness regularly. Here are a couple of ways to do that:

Use Mobile-Friendly Test Tools with Google Search Console

Open your account in Google Search Console and click on "Mobile-Friendly Test" in the Enhancements section. Enter the URL you want to analyze and press the "Test URL" button. Review the generated report: it tells you whether the page is mobile-friendly and suggests how to address any issues found. If the page is not mobile-friendly, the issues may include text that is too small, appearance problems, content too wide for the screen, or other related usability problems. Once you have addressed them, re-test the page with the Mobile-Friendly Test tool, and run regular audits like this in Google Search Console to keep pages across your site mobile-friendly.

Optimize Core Web Vitals

Pay attention to page speed, interactivity, and visual stability to align with Google's Core Web Vitals. One way to do this is with Google's PageSpeed Insights tool, which examines a web page and reports on both the mobile and desktop versions of its content, along with improvement suggestions. Simply enter the URL, click "Analyze," and work through the generated report, which includes a Core Web Vitals assessment of your site's mobile-friendliness. Higher scores mean greater mobile responsiveness.

Common Pitfalls to Avoid in Mobile-First Indexing

  • Ignoring the mobile version of your site – Make sure all content, including text and media, is accessible on mobile devices as well as on the desktop version of your site, so you maintain content parity and protect your SEO rankings.
  • Overlooking technical SEO on mobile – Mobile content needs indexing for SEO purposes too. Use the same structured data on your mobile and desktop sites so search engines can analyze your content consistently.
  • Slow mobile page speed – Compress images, leverage browser caching, and minimize code to speed up page loading on your mobile site and deliver a seamless experience to mobile users.

Final thoughts

Crawlability and indexability are the two pillars that shape your website's SEO performance. With Google prioritizing mobile-first indexing, it is time to roll up your sleeves and build a mobile-focused SEO strategy that aligns with Google's algorithm and highlights your brand identity. With AI and machine learning increasingly shaping SEO, optimizing for mobile devices is not just about adapting to algorithms; it is about meeting user expectations and staying competitive. A mobile-friendly site with fast loading times, a responsive design, and consistent content will improve both search rankings and user experience. As mobile usage grows, embracing mobile-first strategies keeps businesses ahead, enhancing visibility and customer satisfaction in an evolving digital landscape.
Categories: FLOSS Project Planets

Krita 5.2.5 Released!

Planet KDE - Tue, 2024-09-24 20:00

Krita 5.2.5 is here, bringing over 50 bugfixes since 5.2.3 (5.2.4 was a Windows-specific hotfix). Major fixes have been done to audio playback, transform mask calculation and more!

In addition to the core team, special thanks to Maciej Jesionowski, Ralek Kolemios, Freya Lupen, Michael Genda, Rasyuqa A. H., Simon Ra and Sam James for a variety of fixes!

Changes since 5.2.3:
  • Correctly adjust audio playback when animation framerate is changed.
  • Fix no layer being activated on opening a document (Bug 490375)
  • [mlt] Fix incorrect usage of get_frame API (Bug 489146)
  • Fix forbidden cursor blinking when switching tools with shortcuts (Bug 490255)
  • Fix conflicts between mouse and touch actions requested concurrently (Bug 489537)
  • Only check for the presence of bt2020PQColorSpace on Windows (Bug 490301)
  • Run macdeployqt after searching for missing libs (Bug 490181)
  • Fix crash when deleting composition
  • Fix scaling down image with 1px grid spacing (Bug 490898)
  • Fix layer activation issue when opening multiple documents (Bug 490843)
  • Make clip-board pasting code a bit more robust (Bug 490636)
  • Fix a number of issues with frame generation (Bug 486417)
  • A number of changes related to the Qt 6 port.
  • Fix black canvas appearing when "Limit animation frame size" is active (Bug 486417)
  • WebP: fix colorspace export issue when dithering is enabled (Bug 491231)
  • WebP: preserve color profile on export if color model is RGB(A)
  • Fix layer selection when a layer was removed while view was inactive
  • Fix On-Canvas Brush Editor's decimal sliders (Bug 447800, Bug 457744)
  • Make sure file layers are updated when image size or resolution changes (Bug 467257, Bug 470110)
  • Fix Advanced Export of the image with filter masks or layer styles (Bug 476980)
  • Avoid memory leak in the advanced export function
  • Fix mipmaps not being regenerated after transformation was finished or cancelled (Bug 480973)
  • [Gentoo] Don't use xsimd::default_arch in the pixel scaler code
  • KisZug: Fix ODR violation for map_*
  • Fix a crash in Filter Brush when changing the filter type (Bug 478419)
  • PSD: Don't test reference layer for the homogeneity check (Bug 492236)
  • Fix an assert that should have been a safe assert (Bug 491665)
  • Set minimum freetype version to 2.11 (Bug 489377)
  • Set Krita Default on restoring defaults (Bug 488478)
  • Fix loading translated news (Bug 489477)
  • Make sure that older files with simple transform masks load fine & Fix infinite loop with combination of clone + transform-mask-on-source (Bug 492320)
  • Fix more cycling updates in clone/transform-masks combinations (Bug 443766)
  • Fix incorrect threaded image access in multiple clone layers (Bug 449964)
  • TIFF: Ignore resolution if set to 0 (Bug 473090)
  • Specific Color Selector: Update labels for HSX (Bug 475551)
  • Specific Color Selector: Fix RGB sliders changing length (Bug 453649)
  • Specific Color Selector: Fix float slider step 1 -> 0.01
  • Specific Color Selector: Fix holding down spinbox arrows (Bug 453366)
  • Fix clone layers resetting the animation cache (Bug 484353)
  • Fix an assert when trying to activate an image snapshot (Bug 492114)
  • Fix redo actions to appear when undoing juggler-compressed actions (Bug 491186)
  • Update cache when cloning perspective assistants (Bug 493185)
  • Fix a warning on undoing flattening a group (Bug 474122)
  • Relink clones to the new layer when flattening (Bug 476514)
  • Fix onion skins rendering on layers with a transform masks (Bug 457136)
  • Fix perspective value for hovering pixel
  • Fix Move and Transform tool to work with Pass-Through groups (Bug 457957)
  • JPEG XL: Export: implement streaming encoding and progress reporting
  • Deselect selection when pasting from the clipboard (Bug 459162)
Download

Windows

If you're using the portable zip files, just open the zip file in Explorer and drag the folder somewhere convenient, then double-click on the Krita icon in the folder. This will not impact an installed version of Krita, though it will share your settings and custom resources with your regular installed version of Krita. For reporting crashes, also get the debug symbols folder.

Note: We are no longer making 32-bit Windows builds.

Linux

The separate gmic-qt AppImage is no longer needed.

(If, for some reason, Firefox thinks it needs to load this as text: right-click on the link to download.)

MacOS

Note: We're not supporting macOS 10.13 anymore; 10.14 is the minimum supported version.

Android

We consider Krita on ChromeOS as ready for production. Krita on Android is still beta. Krita is not available for Android phones, only for tablets, because the user interface requires a large screen.

Source code

md5sum

For all downloads, visit https://download.kde.org/stable/krita/5.2.5/ and click on "Details" to get the hashes.

Key

The Linux AppImage and the source .tar.gz and .tar.xz tarballs are signed. You can retrieve the public key here. The signatures are here (filenames ending in .sig).

Categories: FLOSS Project Planets
