Feeds

PreviousNext: Filtering and sorting search results by date range with OpenSearch

Planet Drupal - Tue, 2024-06-04 20:53

How to make use of OpenSearch (and Elasticsearch) built-in data-types specifically designed for filtering and sorting date ranges.


 

by lee.rowlands / 5 June 2024

It's fairly common to have content with a date range. E.g., events or content that is only applicable for a given date.

Sorting and filtering results by date can be complicated, especially if you have to support features such as multiple date ranges and optional end dates.

But OpenSearch (and Elasticsearch) have data types specifically designed for this use case.

The key here is to index your data using a Range field type.

Let's consider an example we recently built for NSW.gov.au. The requirements for the field were as follows:

  • Index a grant content-type
  • The grant content-type has a date-range field
  • The field has optional end dates via the Optional End Date module
  • The field is multi value
  • There is also a boolean 'Grant is ongoing' field that flags the grant as 'Evergreen'

The filtering requirements include a 'Status' field, which has options for:

  • Open - the current date must fall inside one of the date-range values
  • Closed - the current date is after all of the date-range values
  • Opening soon - the current date falls inside a future date range

Additionally, the search results should be sorted in the following order when no keywords exist:

  • Open
  • Opening soon
  • Closed

Defining a date range field in Drupal/Search API

The first step is to inform Search API about the date-range data type. How you do this will depend on whether you're using Elasticsearch connector or Search API OpenSearch.

If you're using Search API OpenSearch, the good news is that it is already supported in the 2.x branch.

If you're using Elasticsearch connector, you should implement hook_elasticsearch_connector_supported_data_types_alter() and add date range as a data type, something like this:

/**
 * Implements hook_elasticsearch_connector_supported_data_types_alter().
 */
function YOURMODULE_elasticsearch_connector_supported_data_types_alter(array &$data_types) {
  if (!in_array('date_range', $data_types)) {
    $data_types[] = 'date_range';
  }
}

If you're using Elasticsearch connector, you also need a processor plugin to put the field values into the index. You can take inspiration from how Search API OpenSearch achieves this.

Indexing Drupal data

Once you have those pieces in place, you simply need to configure your date field to use the Date Range data type in the Search API index.
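For reference, the resulting index mapping in OpenSearch should end up looking something like this (the field name and epoch_millis format are assumptions based on the query examples in this post):

```json
{
  "mappings": {
    "properties": {
      "application_dates": {
        "type": "date_range",
        "format": "epoch_millis"
      }
    }
  }
}
```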

Constructing the queries

With the data in your index, the only thing left is to construct your queries to make use of the date range data-type.

In these examples, we'll assume your date-range field is called 'application_dates', that the field is multi-value and that the end date is optional.

We will use 1710201396000 as the current timestamp (in epoch milliseconds).
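To make the examples concrete, a grant document in the index might look something like this, with each range being an object carrying gte/lte bounds and lte simply omitted when there is no end date (titles and timestamps are illustrative):

```json
{
  "title": "Example grant",
  "grant_is_ongoing": false,
  "application_dates": [
    { "gte": 1709251200000, "lte": 1710201600000 },
    { "gte": 1712275200000 }
  ]
}
```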

Filtering to 'open'

This one is the simplest. You need a new term query that uses the current date

{
  "from": 0,
  "size": 10,
  "query": {
    "bool": {
      "must": [
        {
          "term": {
            "application_dates": 1710201396000
          }
        }
      ]
    }
  }
}

For a date-range field, the use of a term query will match any document where the given value is between the start and end of the range.

Filtering to 'closed'

This one is a bit more involved. We need to combine some conditions

{
  "from": 0,
  "size": 10,
  "query": {
    "bool": {
      "minimum_should_match": 1,
      "should": {
        "bool": {
          "minimum_should_match": 1,
          "must_not": [
            {
              "range": {
                "application_dates": {
                  "gte": 1710201396000
                }
              }
            },
            {
              "term": {
                "grant_is_ongoing": true
              }
            }
          ],
          "should": [
            {
              "range": {
                "application_dates": {
                  "relation": "WITHIN",
                  "lte": 1710201396000
                }
              }
            }
          ]
        }
      }
    }
  }
}

First, we have a should, which is an OR query in OpenSearch parlance. We're saying the grant must not have any future open dates (gte: 1710201396000) AND must not be flagged as ongoing. Then we're saying all of the application dates should be in the past (lte: 1710201396000).

Filtering to 'opening soon'

Here again, we need multiple conditions.

{
  "from": 0,
  "size": 10,
  "query": {
    "bool": {
      "must": {
        "range": {
          "application_dates": {
            "relation": "WITHIN",
            "gte": 1710201396000
          }
        }
      },
      "must_not": [
        {
          "term": {
            "application_dates": 1710201396000
          }
        },
        {
          "term": {
            "grant_is_ongoing": true
          }
        }
      ]
    }
  }
}

First, we are saying there must be at least one date range that lies wholly in the future (gte: 1710201396000). Then, via must_not, the grant must not be flagged as ongoing and must not have any date range matching the current date (term: 1710201396000), i.e. it must not already be open.

Sorting

Sorting is a little more complex. We can make use of filters in our sort expressions.

To do this, we also need to index our date-range as an object so we can use nesting.

We duplicate the application_dates field to an application_dates_sort field and make this use the nested data-type. The mapping will look like this

{
  "application_dates_sort": {
    "type": "nested",
    "properties": {
      "gte": {
        "type": "long"
      },
      "lte": {
        "type": "long"
      }
    }
  }
}

Then, we can make use of this in our sort expressions. E.g. to have open items appear first, we can use

{
  "from": 0,
  "sort": [
    {
      "application_dates_sort.lte": {
        "order": "asc",
        "mode": "min",
        "nested": {
          "path": "application_dates_sort",
          "filter": {
            "bool": {
              "must": [
                {
                  "range": {
                    "application_dates_sort.lte": {
                      "gte": 1710201396000
                    }
                  }
                },
                {
                  "range": {
                    "application_dates_sort.gte": {
                      "lte": 1710201396000
                    }
                  }
                }
              ]
            }
          }
        },
        "missing": "_last"
      }
    }
  ],
  "size": 10
}

What we're doing here is sorting by the opening date, ascending, but using a filter to only match on records where the current date falls between the start/end date. We're telling OpenSearch to put any objects that don't match (missing) at the end.

We can stack sort expressions like this — subsequent expressions will apply to those that were 'missing' from the first sort. Combining these lets us achieve the desired sort order of open -> opening soon -> closed.
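As a sketch (not production-ready; timestamps as above), stacking an "open items first" sort with an "opening soon next" sort might look like this, where the second expression only kicks in for documents that were 'missing' from the first:

```json
{
  "sort": [
    {
      "application_dates_sort.lte": {
        "order": "asc",
        "mode": "min",
        "nested": {
          "path": "application_dates_sort",
          "filter": {
            "bool": {
              "must": [
                { "range": { "application_dates_sort.lte": { "gte": 1710201396000 } } },
                { "range": { "application_dates_sort.gte": { "lte": 1710201396000 } } }
              ]
            }
          }
        },
        "missing": "_last"
      }
    },
    {
      "application_dates_sort.gte": {
        "order": "asc",
        "mode": "min",
        "nested": {
          "path": "application_dates_sort",
          "filter": {
            "range": { "application_dates_sort.gte": { "gt": 1710201396000 } }
          }
        },
        "missing": "_last"
      }
    }
  ]
}
```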

Wrapping up

OpenSearch is pretty powerful. But to make the most of its features, you need to ensure your data is mapped into the right data-type. Using date-range fields for storing date-ranges takes a little bit of extra planning, but it leads to much more efficient queries and a better experience for your users.

Tagged OpenSearch
Categories: FLOSS Project Planets

Ruqola 2.2.0

Planet KDE - Tue, 2024-06-04 20:00

Ruqola 2.2.0 is a feature and bugfix release of the Rocket.chat app.

Improvements:

  • Allow to increase/decrease font (CTRL++/CTRL+-)
  • Add channel list style (Condensed/Medium/Extended)
  • Add forward message
  • Improve mentions support.
  • Add support for deep linking.
  • Implement block actions.
  • Implement personal token authentication. Bug 481400
  • Add Plasma Activities Support
  • Add Report User Support
  • Implement Channel Sound Notification.
  • Implement New Room Sound Notification.
  • Implement Sorted/Unsorted markdown list.

Some bug fixing:

  • Fix dark mode support.
  • Fix jitsi support.
  • Fix translate message in direct channel.
  • Don't show @here/@all as user.
  • Reduce memory footprint.
  • Use RESTAPI for logging.
  • Allow sending multiple files.
  • Fix preview url.

URL: https://download.kde.org/stable/ruqola/
Source: ruqola-2.2.0.tar.xz
SHA256: 4091126316ab0cd2d4a131facd3cd8fc8c659f348103b852db8b6d1fd4f164e2
Signed by: E0A3EB202F8E57528E13E72FD7574483BB57B18D Jonathan Esk-Riddell jr@jriddell.org
https://jriddell.org/esk-riddell.gpg

Categories: FLOSS Project Planets

Armin Ronacher: Your Node is Leaking Memory? setTimeout Could be the Reason

Planet Python - Tue, 2024-06-04 20:00

This is mostly an FYI for node developers. The issue being discussed in this post has caused us quite a bit of pain. It has to do with how node deals with timeouts. In short: you can very easily create memory leaks [1] with the setTimeout API in node. You're probably familiar with that API since it's one that browsers have provided for many, many years. The API is pretty straightforward: you schedule a function to be called later, and you get a token back that can be used to clear the timeout later. In short:

const token = setTimeout(() => {}, 100);
clearTimeout(token);

In the browser the token returned is just a number. If you do the same thing in node however, the token ends up being an actual Timeout object:

> setTimeout(() => {})
Timeout {
  _idleTimeout: 1,
  _idlePrev: [TimersList],
  _idleNext: [TimersList],
  _idleStart: 4312,
  _onTimeout: [Function (anonymous)],
  _timerArgs: undefined,
  _repeat: null,
  _destroyed: false,
  [Symbol(refed)]: true,
  [Symbol(kHasPrimitive)]: false,
  [Symbol(asyncId)]: 78,
  [Symbol(triggerId)]: 6
}

This “leaks” out some of the internals of how the timeout is implemented internally. For the last few years I think this has been just fine. Typically you used this object primarily as a token similar to how you would do that with a number. Might look something like this:

class MyThing {
  constructor() {
    this.timeout = setTimeout(() => { ... }, INTERVAL);
  }

  clearTimeout() {
    clearTimeout(this.timeout);
  }
}

For the lifetime of MyThing, even after clearTimeout has been called or the timeout runs to completion, the object holds on to this timeout. On completion or cancellation, the timeout is marked as “destroyed” in node terms and removed from its internal tracking, yet the Timeout object itself survives until someone overrides or deletes the this.timeout reference. That's because the actual Timeout object is held and not just some token. This further means that the garbage collector won't actually collect this thing at all, nor anything that it references. That does not seem too bad as the Timeout object is somewhat hefty, but not too hefty. The most problematic part is most likely the _onTimeout member on it, which might pull in a closure, but it's probably mostly okay in practice.

However, the timeout object can act as a container for more state in a way that is not quite as obvious. A newer API called AsyncLocalStorage, added over the last couple of years and getting some traction, attaches additional state onto all timeouts that fire. Async local storage is implemented in a way that timeouts (and promises and similar constructs) carry forward hidden state until they run:

const { AsyncLocalStorage } = require('node:async_hooks');

const als = new AsyncLocalStorage();
let t;
als.run([...Array(10000)], () => {
  t = setTimeout(() => {
    const theArray = als.getStore();
    assert(theArray.length === 10000);
  }, 100);
});
console.log(t);

When you run this, you will notice that the Timeout holds a reference to this large array:

Timeout {
  _idleTimeout: 100,
  _idlePrev: [TimersList],
  _idleNext: [TimersList],
  _idleStart: 10,
  _onTimeout: [Function (anonymous)],
  _timerArgs: undefined,
  _repeat: null,
  _destroyed: false,
  [Symbol(refed)]: true,
  [Symbol(kHasPrimitive)]: false,
  [Symbol(asyncId)]: 2,
  [Symbol(triggerId)]: 1,
  [Symbol(kResourceStore)]: [Array] // reference to that large array is held here
}

That's because every single async local storage that is created registers itself on the timeout under a custom Symbol(kResourceStore), which remains on there even after a timeout has been cleared or has run to completion. This means that the more async local storage you use, the more “stuff” you hold on to if you don't clear out the timeouts.

The fix seems obvious: rather than holding on to timeouts, hold on to the underlying ID. That's because you can convert a Timeout into a primitive (with for instance the unary + operator). The primitive is just a number like it would be in the browser which then can also be used for clearing. Since a number holds no reference, this should resolve the issue:

class MyThing {
  constructor() {
    // the + operator forces the timeout to be converted into a number
    this.timeout = +setTimeout(() => { ... }, INTERVAL);
  }

  clearTimeout() {
    // clearTimeout and other functions can resolve numbers back into
    // the internal timeout object
    clearTimeout(this.timeout);
  }
}

Except it doesn't (today). In fact, today doing this will cause an unrecoverable memory leak because of a bug in node [2]. Once that is resolved, however, this should be a fine way to avoid the problem.

Workaround for the leak with a Monkey-Patch

Since the bug is only triggered when a timer manages to run to completion, you could in theory forcefully clear the timeout or interval on completion if node “allocated” a primitive ID for it like so:

const kHasPrimitive = Reflect
  .ownKeys(setInterval(() => {}))
  .find((x) => x.toString() === 'Symbol(kHasPrimitive)');

function invokeSafe(t, callable) {
  try {
    return callable();
  } finally {
    if (t[kHasPrimitive]) {
      clearTimeout(t);
    }
  }
}

const originalSetTimeout = global.setTimeout;
global.setTimeout = (callable, ...rest) => {
  const t = originalSetTimeout(() => invokeSafe(t, callable), ...rest);
  return t;
};

const originalSetInterval = global.setInterval;
global.setInterval = (callable, ...rest) => {
  const t = originalSetInterval(() => invokeSafe(t, callable), ...rest);
  return t;
};

This obviously makes a lot of assumptions about the internals of node, and it will slightly slow down every timer created via setTimeout and setInterval, but it might help you in the interim if you do run into that bug.

Until then, the second best thing you can do is to be very aggressive about deleting these tokens manually the moment you no longer need them:

class MyThing {
  constructor() {
    this.timeout = setTimeout(() => {
      this.timeout = null;
      ...
    }, INTERVAL);
  }

  clearTimeout() {
    if (this.timeout) {
      clearTimeout(this.timeout);
      this.timeout = null;
    }
  }
}

How problematic are timeouts? It's hard for me to say, but there are a lot of places where code holds on to timeouts and intervals in node for longer than is healthy. If you are trying to make things such as hot code reloading work, or you are working with long-lasting or recurring timeouts, it might be very easy to run into this problem. Due to how widespread these timeouts are and the increased use of async local storage, I can only assume that this will become a more common issue people run into. It's also a bit devious because you might not even know that you use async local storage as a user.

We're not the first to run into issues like this. For instance, Next.js is trying to work around related issues by periodically patching setTimeout and setInterval to forcefully clear out intervals to avoid memory leakage in the dev server. (Which unfortunately sometimes runs into the node bug mentioned above due to its own use of toPrimitive.)

How widespread is async local storage? It depends a bit on what you do. For instance we (and probably all players in the observability space including the OpenTelemetry project itself) use it to track tracing information with the local flow of execution. Modern JavaScript frameworks also sometimes are quite active users of async local storage. In the particular case we were debugging earlier today a total of 7 async local storages were attached to the timeouts we found in the heap dumps, some of which held on to massive react component trees.

Async local storage is great: I'm a huge proponent of it! If you have ever used Flask you will realize that Flask is built on a similar concept (thread locals, nowadays context vars) to give you access to the right request object. What makes async local storage a bit scary, however, is that it's very easy to hold on to memory accidentally. In node's case, that is particularly easy with timeouts.

At the very least, for timeouts in node there might be a simple improvement: no longer exposing the internal Timeout object. Node could in theory return a lightweight proxy object that breaks the cycle after the timeout has been executed or cleared. How backwards compatible that could be made, however, I don't know.

For improving async local storage longer term I think the ecosystem might have to embrace the idea about shedding contextual state. It's incredibly easy to leak async local storage today if you spawn "background" operations that last. For instance today a console.log will on first use allocate an internal TTY resource which accidentally holds on to the calling async local storage of completely unrelated stuff. Whenever a thing such as console.log wants to create a long lasting resource until the end of the program, helper APIs could be provided that automatically prevent all async local storage from propagating. Today there is only a way to prevent a specific local storage from propagating by disabling it, but that requires knowing which ones exist.

[1] Under normal circumstances these memory leaks would not be permanent leaks. They would resolve themselves when you finally drop a reference to that token. However, due to a node bug it is currently possible for these leaks to be unrecoverable.

[2] How we found that bug might be worth a story for another day.
Categories: FLOSS Project Planets

First Beta for Krita 5.2.3 Released

Planet KDE - Tue, 2024-06-04 20:00

We are releasing the first Beta of Krita 5.2.3. This release primarily brings a complete overhaul of our build system, making it so that our CI system can now build for all 4 platforms (a Continuous Integration system basically builds a program after every change, runs some tests and based on that helps us track down mistakes we made when changing Krita's code).

Beyond the rework of the build system, this release also has numerous fixes, particularly with regards to animated transform masks, jpeg-xl support, shortcut handling on Windows and painting assistants.

In addition to the core team, special thanks goes out to Freya Lupen, Grum 999, Mathias Wein, Nabil Maghfur Usman, Alvin Wong, Deif Lou and Rasyuqa A. H. for various fixes, as well as the first time contributors in this release cycle (each mentioned after their contribution).

  • Fix docker ShowOnWelcomePage flag (Bug 476683)
  • windows: Exclude meson in release package
  • Fix integer division rounding when scaling image (Patch by David C. Page, Thanks!)
  • Fix Comic Manager export with cropping TypeError
  • WebP: properly export color profile on RGBA model (Bug 477219)
  • WebP: properly assign profile when importing image with non-RGBA model (Bug 478307)
  • Fix saving animated webp
  • Added missing layer conversion options to layer docker right-click menu (Patch by Ken Lo, thanks! MR 2039)
  • Fix Last Documents Docker TypeError
  • Replace the Backers tab with a Sponsors tab
  • Comic Project Management Tools: Fix crash in comic exporter due to incorrect API usage. (Bug 478668) Also turn off metadata retrieval for now.
  • Properly check color patch index for validity (Bug 480188)
  • Fix build with libjxl 0.9.0 (Bug 478987, patch by Timo Gurr, thanks!)
  • Included Krita's MLT plugin in our source tree, as well as several fixes to it.
  • Possibly fix ODR violation in the transform tool strategies (Bug 480520)
  • Fix framerate issue when adding audio to the image (Bug 481388)
  • Fix artifacts when drawing on a paused animation (Bug 481244)
  • Fix issues generating reference image in KisMergeLabeledLayersCommand (Bug 471896, Bug 472700)
  • Simplify similar color selection tool and add spread option
  • Fix artifacts when moving a layer under a blur adjustment layer (Bug 473853)
  • Make sure that tool-related shortcuts have priority over canvas actions (Bug 474416)
  • Refactor KisLodCapableLayerOffset into a reusable structure
  • Fix current time loading from a .kra file with animation
  • Clone scalar keyframes when creating a new one in the GUI
  • Refactor transform masks not to discard keyframe channels on every change (Bug 475385)
  • Fix artifacts when flattening animated transform mask (Bug 475334)
  • Fix extra in-stroke undo step when using Transform and Move tools (Bug 444791)
  • Fix rounding in animated transform masks with translation (Bug 456731)
  • Fix canvas inputs breaking due to inconsistent key states (Bug 451424)
  • Make wave filter allowed in filter masks/layers (Bug 476033)
  • Fix an offset when opening files with animated transform masks (Bug 476317)
  • Fix transform mask offsets after running a transform tool on that (Bug 478966)
  • Fix creation of keyframes before the first frame for scalar channels (Bug 478448)
  • Support xsimd 12 (Bug 478986)
  • Fix new scalar frames to use interpolated values (Bug 479664)
  • Don't skip updates on colorize mask props change (Bug 475927)
  • Fix restoring layer's properties after new-layer-from-visible action (Bug 478225)
  • Fix transform masks being broken after duplication (Bug 475745)
  • Fix a crash when trying to open multiple files with one D&D (Bug 480331)
  • Fix cache being stuck when adding hold frames (Bug 481099)
  • Make PerspectiveAssistant respect snapToAny parameter
  • Fix logic in 2-Point Perspective Assistant
  • Remove unnecessary locking code from several assistants
  • Move a bunch of code repeated in derived classes to KisPaintingAssistant
  • Fix wg selector popup flickering with shortcut
  • Fix updating KoCanvasControllerWidget's canvas Y offset when resizing the canvas
  • Fix esthetics of the Welcome Page, remove banner.
  • Re-enable the workaround for plasma global menu (Bug 483170)
  • Improvements to support for Haiku OS.
  • Fix crash on KisMergeLabeledLayersCommand when using color labeled transform masks (Bug 480601)
  • PNG: prevent multiple color conversions (Bug 475737)
  • Fix crash when pasting font family in TextEditor (Bug 484066)
  • Fix grids glitches when subdivision style is solid (Bug 484889)
  • Add 1px padding to rects in KisPerspectiveTransformWorker to account for bilinear filtering (Bug 484677)
  • Fix activation of the node on opening .kra document (Bug 480718)
  • Various snapcraft fixes by Scarlet Moore
  • Fix sequence of imagepipe brushes when used in the line tool (Bug 436419)
  • tilt is no longer ignored in devices that support tilt (Bug 485709, patch by George Gianacopoulos, thanks!)
  • Make Ogg render consistently check for Theora
  • Fix issue with appbundle builder using libs.xml from arm64-v8a (Bug 485707)
  • Don't reset redo when doing color picking (Bug 485910)
  • Show an error when trying to import version 3 xcf files (Bug 485420)
  • Remove a bogus warning about Intel's openGL driver
  • Fix incorrect inclusion of "hard mix softer" blend mode in layer styles
  • Fixed issue on S24 ultra regarding S-pen settings when HiDPI is turned off (Patch by Spencer Sawyer, thanks!).
  • Revive code path in perspective ellipse and 2pp assistant draw routine
  • Don't try to regenerate frames on an image that has no animation
  • Fix data loss on reordering storyboards.
  • Improving the quality of icons for Android (Patch by Jesse 205, thanks! Bug 463043)
  • Enable SIP type stub file generation (Patch by Kate Corcoran, thanks!)
Download

Windows

If you're using the portable zip files, just open the zip file in Explorer and drag the folder somewhere convenient, then double-click on the Krita icon in the folder. This will not impact an installed version of Krita, though it will share your settings and custom resources with your regular installed version of Krita. For reporting crashes, also get the debug symbols folder.

Note that we are not making 32-bit Windows builds anymore.

Linux

The separate gmic-qt AppImage is no longer needed.

(If, for some reason, Firefox thinks it needs to load this as text: to download, right-click on the link.)

macOS

Note: if you use macOS Sierra or High Sierra, please check this video to learn how to enable starting developer-signed binaries, instead of just Apple Store binaries.

Android

We consider Krita on ChromeOS as ready for production. Krita on Android is still beta. Krita is not available for Android phones, only for tablets, because the user interface requires a large screen.

We are no longer building the 32-bit Intel CPU APK, as we suspect it was only installed by accident and none of our users actually used it. We are hoping for feedback on this matter.

Source code

md5sum

For all downloads, visit https://download.kde.org/unstable/krita/5.2.3-beta1/ and click on Details to get the hashes.

Key

The Linux AppImage and the source .tar.gz and .tar.xz tarballs are signed. You can retrieve the public key here. The signatures are here (filenames ending in .sig).

Categories: FLOSS Project Planets

Dirk Eddelbuettel: ulid 0.4.0 on CRAN: Extended to Milliseconds

Planet Debian - Tue, 2024-06-04 19:43

A new version of the ulid package is now on CRAN. The package provides ‘universally (unique) lexicographically (sortable) identifiers’ – see the spec at GitHub for details on those – which offer sorting that uuids lack. The R package provides access via the standard C++ library, had been put together by Bob Rudis, and is now maintained by me.

Mark Heckmann noticed that a ulid round trip of generating and unmarshalling swallowed subsecond information, and posted about it on a well-known site I no longer go to. Duncan Murdoch was kind enough to open an issue to make me aware, and in it included the nice minimally complete verifiable example by Mark.

It turns out that this issue was known, documented upstream in two issues, and fixed in a fork by the author of those issues, Chris Bove. It replaces time_t as the value of record (constrained to second resolution) with a proper std::chrono object which offers milliseconds (and much more, yay Modern C++). So I switched the two main files of the library to his, and updated the wrapper code to interface from POSIXct to the std::chrono object. And with that we are in business. The original example creates five ulids 100 milliseconds apart, which are then unmarshalled and printed here as a data.table (as data.frame by default truncates to seconds):

> library(ulid)
> gen_ulid <- \(sleep) replicate(5, {Sys.sleep(sleep); generate()})
> u <- gen_ulid(.1)
> df <- unmarshal(u)
> data.table::data.table(df)
                        ts              rnd
                    <POSc>           <char>
1: 2024-05-30 16:38:28.588 CSQAJBPNX75R0G5A
2: 2024-05-30 16:38:28.688 XZX0TREDHD6PC1YR
3: 2024-05-30 16:38:28.789 0YK9GKZVTED27QMK
4: 2024-05-30 16:38:28.890 SC3M3G6KGPH7S50S
5: 2024-05-30 16:38:28.990 TSKCBWJ3TEKCPBY0
>

We updated the documentation accordingly, and added some new tests as well. The NEWS entry for this release follows.

Changes in version 0.4.0 (2024-06-03)
  • Switch two functions to fork by Chris Bove using std::chrono instead of time_t for consistent millisecond resolution (#3 fixing #2)

  • Updated documentation showing consistent millisecond functionality

  • Added unit tests for millisecond functionality

Courtesy of my CRANberries, there is also a diffstat report for this release. If you like this or other open-source work I do, you can sponsor me at GitHub.

This post by Dirk Eddelbuettel originated on his Thinking inside the box blog. Please report excessive re-aggregation in third-party for-profit settings.

Categories: FLOSS Project Planets

TestDriven.io: Storing Django Static and Media Files on DigitalOcean Spaces

Planet Python - Tue, 2024-06-04 18:28
This tutorial shows how to configure Django to load and serve up static and media files, public and private, via DigitalOcean Spaces.
Categories: FLOSS Project Planets

FSF Events: Free Software Directory meeting on IRC: Friday, June 07, starting at 12:00 EDT (16:00 UTC)

GNU Planet! - Tue, 2024-06-04 16:28
Join the FSF and friends on Friday, June 07, from 12:00 to 15:00 EDT (16:00 to 19:00 UTC) to help improve the Free Software Directory.
Categories: FLOSS Project Planets

PyCoder’s Weekly: Issue #632 (June 4, 2024)

Planet Python - Tue, 2024-06-04 15:30

#632 – JUNE 4, 2024
View in Browser »

What’s a Python Hashable Object?

You can ignore reading about hashable objects for quite a bit. But eventually, it's worth having an idea of what they are. This post follows Winston on his first day at work to understand hashable objects.
THEPYTHONCODINGSTACK.COM • Shared by Stephen Gruppetta

PyCon US 2024 Recap

Part travelogue, part conference diary, Katherine writes about her experience at PyCon US 2024 held in Pittsburgh, PA.
KATHERINE MICHEL

[Video] Saga Pattern Simplified: Building Sagas with Temporal

In this on-demand webinar, we give an overview of Sagas, including challenges and benefits. Then, we introduce Temporal and demonstrate how easy it is to build, test, and run Sagas using our platform and coding in your preferred language. No prior knowledge of Temporal is needed to follow along →
TEMPORAL sponsor

One Way to Fix Python Circular Imports

Python circular imports can be confusing. Simply using a different form of import can sometimes fix the problem.
NED BATCHELDER

Guido Withdraws Ownership From Core Interpreter

Associated HN Discussion
GITHUB.COM/PYTHON

Django Day Copenhagen 2024 Call for Proposals

DJANGO DAY

Discussions

What Would You Spend Time on if You Didn’t Need Money?

HACKER NEWS

Articles & Tutorials

Three Laws of Software Complexity

Mahesh states that most software engineers (particularly those working on infrastructural systems) are destined to wallow in unnecessary complexity due to three fundamental laws. Associated HN Discussion.
MAHESH BALAKRISHNAN

Efficient Iterations With Python Iterators and Iterables

In this video course, you’ll learn what iterators and iterables are in Python. You’ll learn how they differ and when to use them in your code. You’ll also learn how to create your own iterators and iterables to make data processing more efficient.
REAL PYTHON course

Authentication Your Whole Team Will Love

“With PropelAuth, I think I’ve spent about a day – total – on auth over the past year.” PropelAuth is easy to integrate and provides all the tools your team needs to manage your users - dashboards, user insights, impersonation, SSO and more →
PROPELAUTH sponsor

My BDFL Guiding Principles

Daniel is the Benevolent Dictator For Life of the curl library, and in this post he considers how the “dictator” part may not be the best way to run an open source project. He covers 10 principles he tries to embody when guiding the curl project.
DANIEL STENBERG

Writing Fast String ufuncs for NumPy 2.0

NumPy 2.0 is coming shortly and has loads of big changes and new features. This article talks about NumPy’s universal functions which can be applied to the elements of a NumPy array, and how to write good ones when dealing with strings.
LYSANDROS NIKOLAOU

How Python Compares Floats and Integers

Floating point isn’t a precise representation, this can mean that when you compare a float with an int, something that looks like it should be the same, might not be. This article deep dives on how that works in Python.
ABHINAV UPADHYAY

Thinking About Running for the PSF Board of Directors?

The Python Software Foundation runs office hours and this year they’ve set aside two times for asking questions of current board members. If you’re thinking of running, this is a perfect place to get more information.
PYTHON SOFTWARE FOUNDATION

How EuroPython Proposals Are Selected

This post from the folks at EuroPython walks you through how their conference selection process works. Although EuroPython specific, it contains lots of good general advice for getting accepted at any conference.
CRISTIÁN MAUREIRA-FREDES & SANGARSHANAN

Django Enhancement Proposal 14: Background Workers

A DEP is like a PEP, but for Django. There are many libraries out there for dealing with background tasks when writing Django code, DEP 14 proposes an interface for such systems as well as a default backend.
DJANGO SOFTWARE FOUNDATION
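The names below are purely illustrative (not DEP 14's actual API), but the shape of such a proposal is a common one: calling code enqueues tasks through a backend interface, and swapping the backend changes where and when the tasks actually run.

```python
import queue


class ImmediateBackend:
    """A no-infrastructure fallback: runs each task right away, in-process."""

    def enqueue(self, func, *args, **kwargs):
        return func(*args, **kwargs)


class QueueBackend:
    """Buffers tasks so a separate worker loop can drain them later."""

    def __init__(self):
        self.pending = queue.SimpleQueue()

    def enqueue(self, func, *args, **kwargs):
        self.pending.put((func, args, kwargs))

    def run_pending(self):
        results = []
        while not self.pending.empty():
            func, args, kwargs = self.pending.get()
            results.append(func(*args, **kwargs))
        return results


# Calling code stays the same no matter which backend is configured
backend = ImmediateBackend()
print(backend.enqueue(lambda x: x * 2, 21))  # -> 42
```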

How to Create Pivot Tables With pandas

In this tutorial, you’ll learn how to create pivot tables using pandas. You’ll explore the key features of DataFrame’s pivot_table() method and practice using them to aggregate your data in different ways.
REAL PYTHON
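A minimal sketch of the method the tutorial covers (the column names here are made up for illustration): `pivot_table()` turns one column into the row index, another into the columns, and aggregates a third into the cells.

```python
import pandas as pd

sales = pd.DataFrame({
    "region": ["north", "north", "south", "south"],
    "quarter": ["Q1", "Q2", "Q1", "Q2"],
    "revenue": [100, 120, 90, 110],
})

# Rows become regions, columns become quarters, cells aggregate revenue
pivot = sales.pivot_table(index="region", columns="quarter",
                          values="revenue", aggfunc="sum")
print(pivot)
```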

Essays on Programming I Think About a Lot

A collection of essays on software from a variety of sources. Content includes how to choose your tech stack, products, abstractions, and more.
BEN KUHN

Projects & Code

django-auditlog: Log of Changes to an Object

GITHUB.COM/JAZZBAND

Python Resources for Working With Excel

PYTHON-EXCEL.ORG

Pytest Plugin to Disable Tests in One or More Files

GITHUB.COM/POMPONCHIK • Shared by Evgeniy Blinov (pomponchik)

UXsim: Vehicular Traffic Flow Simulator in Road Network

GITHUB.COM/TORUSEO

Python Finite State Machines Made Easy

GITHUB.COM/FGMACEDO • Shared by Fernando Macedo

Events

Weekly Real Python Office Hours Q&A (Virtual)

June 5, 2024
REALPYTHON.COM

DjangoCon Europe 2024

June 5 to June 10, 2024
DJANGOCON.EU

Canberra Python Meetup

June 6, 2024
MEETUP.COM

PyCon Colombia 2024

June 7 to June 10, 2024
PYCON.CO

IndyPy Hosts “AI Powered by Python Panel”

June 11, 2024
MEETUP.COM • Shared by Laura Stephens

Wagtail Space NL

June 12 to June 15, 2024
WAGTAIL.SPACE

Happy Pythoning!
This was PyCoder’s Weekly Issue #632.
View in Browser »

[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]

Categories: FLOSS Project Planets

Brian Perry: Matching Drupal’s GitLab CI ESLint Configuration in a Contrib Module

Planet Drupal - Tue, 2024-06-04 12:04

The Drupal Association now maintains a GitLab CI Template that can be used for all Drupal contrib projects. It's an excellent way to quickly take advantage of Drupal.org's CI system and ensure your project is following code standards and best practices. And using it has the bonus of giving you a sweet green checkmark on your project page!

We recently added this template to the Same Page Preview module. After doing so, our JavaScript linting was failing. This wasn't surprising since we hadn't yet committed a standard ESLint or Prettier configuration to the codebase. I took a shot at resolving these linting issues, initially turning to the ESLint Drupal Contrib plugin. This allowed me to get ESLint up and running quickly and run linting only within the context of this module. I resolved all of the linting issues, pushed my work up to GitLab, and started thinking about how I'd reward myself for a job well done.

Disaster Strikes

And as you might expect, the CI build still failed. 🤦‍♂️

At this point I took a step back. First off, I needed to determine what differed between my ESLint process and the one that was being executed by the Drupal Gitlab CI Template. Secondly, beyond just getting the CI job to pass, I wanted to define the linting use cases I was trying to solve for. I decided to focus on the following:

  1. Determining how to run the exact same ESLint command that the GitLab CI Template was running, using the same configuration as Drupal Core.
  2. Developing an ESLint configuration that could be run within the standalone module codebase (with or without an existing instance of Drupal) but matching Drupal Core and GitLab CI's configuration as closely as possible.
Using the Drupal Core ESLint Configuration

Here we literally want to use the same ESLint binary and config used by Drupal Core. Since this is what Drupal's GitLab CI Template is doing, this is also an opportunity to match the CI linting configuration as closely as possible.

The CI job is running the following command:

$ $CI_PROJECT_DIR/$_WEB_ROOT/core/node_modules/.bin/eslint \
    --no-error-on-unmatched-pattern --ignore-pattern="*.es6.js" \
    --resolve-plugins-relative-to=$CI_PROJECT_DIR/$_WEB_ROOT/core \
    --ext=.js,.yml \
    --format=junit \
    --output-file=$CI_PROJECT_DIR/junit.xml \
    $_ESLINT_EXTRA . || EXIT_CODE_FILE=$?

And prior to that command, symlinks are also created for some relevant configuration files:

$ cd $CI_PROJECT_DIR/$_WEB_ROOT/modules/custom/$CI_PROJECT_NAME
$ ln -s $CI_PROJECT_DIR/$_WEB_ROOT/core/.eslintrc.passing.json $CI_PROJECT_DIR/$_WEB_ROOT/modules/custom/.eslintrc.json
$ ln -s $CI_PROJECT_DIR/$_WEB_ROOT/core/.eslintrc.jquery.json $CI_PROJECT_DIR/$_WEB_ROOT/modules/custom/.eslintrc.jquery.json
$ test -e .prettierrc.json || ln -s $CI_PROJECT_DIR/$_WEB_ROOT/core/.prettierrc.json .
$ test -e .prettierignore || echo '*.yml' > .prettierignore

This means that we'll need to run eslint using Core's 'passing' configuration (which itself extends the 'jquery' configuration).

To match that, I created an eslint:core script in the module's package.json:

{ "scripts": { "eslint:core": "../../../core/node_modules/.bin/eslint . \ --no-error-on-unmatched-pattern \ --ignore-pattern='*.es6.js' \ --resolve-plugins-relative-to=../../../core \ --ext=.js,.yml \ -c ../../../core/.eslintrc.passing.json" } }

I was surprised to find that even after running this command locally, the CI job was still failing. It turned out that ESLint wasn't using Core's Prettier config in this case, resulting in a different set of formatting rules being applied. Copying core/.prettierrc.json into the module's root directory resolved this issue.

Copying Drupal Core's prettier config wholesale isn't great. The approaches to referencing and extending a prettier config are clunky, but possible. A more ideal solution would be to have Drupal's prettier config as a package that could be referenced by both core and contrib modules.

Using a Standalone ESLint Configuration

Ideally it would also be possible to run this linting outside of the context of a full Drupal instance. This could help speed up things like pre-commit hooks, some CI tasks, and also make quick linting checks easier to run. With the lessons from using Drupal Core's ESLint configuration fresh in mind, I took another shot at using the eslint-plugin-drupal-contrib plugin.

First, I installed it in the module as a dev dependency:

npm i -D eslint-plugin-drupal-contrib

Next, I created a file .eslintrc.contrib.json in the module's root directory:

{ "extends": ["plugin:drupal-contrib/passing"] }

This will result in eslint using the same configuration as Drupal Core's 'passing' configuration, but without needing to reference Core's configuration files. Finally, you can run this by adding the following eslint script in the module's package.json:

{ "scripts": { "eslint": "eslint . \ -c .eslintrc.contrib.json \ --no-eslintrc" } }

You might be surprised to see the --no-eslintrc flag above. That prevents ESLint from looking for any other configuration files in the directory tree. Without it, ESLint will find the Drupal Core configuration files if this happens to be run from within a Drupal project. This will result in ESLint attempting to resolve plugins using Drupal Core's node_modules directory, which may or may not exist.

Also note that ESLint 8.x uses the --no-eslintrc flag, while the ESLint 9.x equivalent is --no-config-lookup. Drupal core is currently on ESLint 8.x, which is the previous major release.

Happy Linting

I ran into a few more hiccups than I expected along the way, but now feel confident that I can have consistent linting results between my local environment and the Drupal.org CI system in all of the JavaScript code I write for contrib modules. Hopefully this can help you do the same.

Resources
Categories: FLOSS Project Planets

The Drop Times: Insights from DrupalJam Speakers: Integration, Experience, and Advocacy

Planet Drupal - Tue, 2024-06-04 10:15
Dive into the heart of DrupalJam 2024 with exclusive insights from notable speakers! Discover the invaluable perspectives shared by Frederik Wouters, Mikko Hämäläinen, and Mathias Bolt Lesniak as they shed light on integrating RAG and OpenAI technologies, exploring Digital Experience Platforms (DXP), and advocating for open-source software. With sessions catering to all levels of expertise, DrupalJam promises practical knowledge and community engagement for every attendee. Stay tuned as we unravel each speaker's session, providing a deeper understanding of Drupal and open-source technologies at this premier event.
Categories: FLOSS Project Planets

Real Python: Python Interfaces: Object-Oriented Design Principles

Planet Python - Tue, 2024-06-04 10:00

Interfaces play an important role in software engineering. As an application grows, updates and changes to the code base become more difficult to manage. More often than not, you wind up having classes that look very similar but are unrelated, which can lead to some confusion. In this video course, you’ll see how you can use a Python interface to help determine what class you should use to tackle the current problem.

In this video course, you’ll be able to:

  • Understand how interfaces work and the caveats of Python interface creation
  • Comprehend how useful interfaces are in a dynamic language like Python
  • Implement an informal Python interface
  • Use abc.ABCMeta and @abc.abstractmethod to implement a formal Python interface
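The formal-interface technique from the last bullet can be sketched like this (the class names are mine, not from the course): `abc.ABCMeta` plus `@abc.abstractmethod` makes incomplete implementations fail at instantiation time rather than at call time.

```python
import abc
import json


class Serializer(metaclass=abc.ABCMeta):
    """A formal interface: subclasses must implement serialize()."""

    @abc.abstractmethod
    def serialize(self, data):
        raise NotImplementedError


class JsonSerializer(Serializer):
    def serialize(self, data):
        return json.dumps(data)


# Serializer() itself raises TypeError: it has an unimplemented abstract method
print(JsonSerializer().serialize({"a": 1}))  # -> {"a": 1}
```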

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]

Categories: FLOSS Project Planets

Jonathan Dowland: Quake (soundtrack)

Planet Debian - Tue, 2024-06-04 06:17

I haven't done that much crate digging recently, but I did stick this on last week: Trent Reznor's soundtrack for Quake, originally released (within the game) in 1996, and finally issued for the first time independently in 2020.

Quake LP cover and inner covers

I picked it up at the Nine Inch Nails gig in Cornwall in 2022.

An interesting factoid about the original release is that the CD was mastered with the little-known pre-emphasis flag set to "on". This was unusual enough at the time (1996) that it was never clear whether it was deliberate or not. CD ripping back then usually used an analog audio path from the CD-ROM drive to the PC sound card, and the drive would apply the necessary de-emphasis. Therefore, ripping software didn't need to deal with it, and most of it (then and now) still doesn't, even though the path has long since changed to purely digital extraction. Thus, the various copies of the soundtrack in circulation may or may not have had pre-emphasis correction applied, and if they did, it may or may not have been required to hear the soundtrack as it was intended.

I spent a bit of time a few years ago, before the reissue, trying to determine what was "correct". There is certainly an audible difference with pre-emphasis applied (or not), but it wasn't clear which was the intended experience. The reissue should have cleared this up once and for all, but I haven't gone back to check what the outcome was.

Categories: FLOSS Project Planets

Contributions of Open Source to AI: a panel discussion at CPDP-ai conference

Open Source Initiative - Tue, 2024-06-04 05:00

I participated as a panelist at the CPDP-ai 2024 conference in Brussels last week where we discussed the significant contributions of Open Source to AI and highlighted the specific properties that differentiate Open Source AI from proprietary solutions. Representing the Open Source Initiative (OSI), the globally recognized non-profit that defines the term Open Source, I emphasized the longstanding principle of granting users full agency and control over technology, which has been proven to deliver extensive social benefits.

Below is a glimpse at the questions and answers posed to me and my fellow panelists:

Question: Stefano, please explain what the contribution to AI from Open Source is, and if there are specific properties of Open Source AI that make a difference for the users and for the people who are confronted with its results.

Response: The Definition of Open Source Software has existed for over 25 years; that doesn't apply to AI. The Open Source Definition for software provides a stable north star for all participants in the digital ecosystem, from small and large companies to citizens and governments.

The basic principle of the Open Source Definition is to grant to the users of any technology full agency and control over the technology itself. This means that users of Open Source technologies have self-sovereignty of the technical solutions.

The Open Source Definition has demonstrated that massive social benefits accrue when you remove the barriers to learning, using, sharing and improving software systems. There is ample evidence that giving users agency, control and self-sovereignty of their technical choices produces a viable ecosystem based on permissionless innovation. Multiple studies by the EU Commission and Harvard researchers have assigned significant economic value to Open Source Software, all based on that single, clear, understood and approved Definition from 26 years ago.

For AI, and especially the most recent machine learning solutions, it’s less clear how society can maintain self-sovereignty of the technology and how to achieve permissionless innovation. Despite the fact that many people talk about Open Source AI, including the AI Act, there is no shared understanding of what that means, yet!

The Open Source Initiative is concluding a global, multi-stakeholder co-design process to find an unequivocal definition of Open Source AI, and we’re heading towards the conclusion of this process with a vastly increased knowledge of the AI machine learning space. The current draft of the Open Source AI Definition recognizes that in order to study, use, share and modify AI, one needs to refer to an AI system, not a single individual component. The global process has identified the components required for society to maintain control of the technology and these are: 

  • Detailed information about the dataset used to train the system and the code so that a skilled person can train a system with similar capabilities
  • All the libraries and tools used to run training and inference
  • The model architecture and the parameters, like weights and biases

Having unrestricted access to all these elements is what makes an AI an Open Source AI.

We’re in the final stretch of the process, starting to gather support for the current draft of the definition.

The most controversial part of the discussion is the role of data in the training. To answer your question about the power of big foreign tech companies, putting aside the hardware requirements, the data is where the fight is. There seem to be two views of the world on data when it comes to AI: One thinks that text and data mining is basically strip mining humanity and all accumulation of data without consent of the rights holders must be made illegal. Another view of the world is that text and data mining for the purpose of training Open Source AI is probably the only antidote to the superpowers of large corporations. These camps haven’t found a common position yet. Japan seems to have made up its mind already, legalizing unrestricted text and data mining. We’ll see where the lawsuits in the US will go, if they ever get to a decision in court or, as I suspect, they will be settled out of court. 

In any case, data, competence and to some extent hardware, are the levers to control the development of AI. 

Open Source has been leveling the playing field of technologies. We know from past experience with Open Source software that giving people unrestricted access to the means of digital production enables tremendous economic value. This worked in Europe as well as in China. We think that Open Source AI can have the same effect of generating value while leaving control of the technology in the hands of society.

Question: Big tech companies are important for the development of AI. Apart from the purely technological impacts, there is also economic importance. The European Commission has been very concerned about the Digital Single Market recently, and has initiated legislation such as DSA and DMA to improve competition and market access. Will these instruments be sufficient in view of AI roll-out, thinking also of the recently adopted AI Act? Or will additional attention need to be paid?

Response: Open is the best antidote to the concentration of power. That said, I see these legislations as the sticks, very necessary. I’d love us to think also about carrots. We don’t want to repeat the mistakes of the past with the early years of the internet. Open Source software was equally available in the US and Europe but despite that, the few European champions of Open Source haven’t grown big enough to have a global impact. And some of the biggest EU companies aren’t exactly friendly with Open Source either. 

Chinese companies have taken a different approach. But in Europe we have talents, and we have an attractive quality of life so we can get even more talents. Finding money is never an issue. We need to remove the disincentives to grow our companies bigger, widen the access to the internal EU market and support their international expansion, too.

For example, we need to review European Regulation 1025, on standardization to accommodate for Open Source. 1025 Regulation was written at a time when Open Source was considered a “business model” and information and communication technology standards were about voltages in a wire. Today, Open Source is between 80% and 90% of all software and “digital elements” comprise some part of every modern product. Even hardware solutions are dominated by “digital elements.” As such, the approach taken by 1025 is out of date and most likely needs a root-and-branch rethink to properly apply to the world today and the world we anticipate tomorrow.

We need to make sure that the standardization rules required by the Cyber Resilience Act are written together with Open Source champions so the rules don't favor exclusively the cartel of European patent holders who try to seek rent instead of innovating. Europe has all the means to be at the center of AI innovation; it embodies the right values of diversity and collaboration.

Closing remarks: We think that Open Source is the best antidote to fight market concentration in AI. Data is where the concentration of power is happening now and it’s in the hands of massive corporations: not only Google, Meta, Amazon, Reddit but also Sony, Warner, Netflix, Getty Images, Adobe … All these companies have already gained access to massive amounts of data, legally. These companies basically own our data, legally: Our pictures, the graph of our circles of friends, all the books and movies… 

There is a risk that if we don’t write policies that allow text and data mining in exchange of a real Open Source AI (one that society can fully control) then we risk leaving the most powerful AI systems in the hands of the oligopoly who can afford trading money for access to data.

Categories: FLOSS Research

Specbee: A quick guide to integrating CiviCRM with Drupal

Planet Drupal - Tue, 2024-06-04 04:02
Keeping up with your customers' evolving expectations is not easy - especially when you want to deliver a memorable customer experience. But by leveraging the power of a CRM (Customer Relationship Management) tool, you can rest assured that you're equipped to meet those expectations head-on. A CRM doesn't just help you manage customer data; it empowers you to understand and anticipate customer needs, creating a proactive rather than reactive approach to customer service. Drupal is great with integrations - you can seamlessly integrate it with almost any third-party tool - and combining a CMS like Drupal with a CRM like CiviCRM can revolutionize the way your organization manages data, engagement, and operations. CiviCRM is an open-source, web-based CRM tool that caters to the needs of non-profits and other civic-sector organizations. In this article, let's learn how to integrate CiviCRM with Drupal 10.

What you need

You only need two prerequisites before commencing your CiviCRM and Drupal 10 integration:

  • The cv tool
  • Composer

I assume that you already have a pre-installed Drupal site built with Composer, so let's not dig into installing Composer.

Installing the cv tool

Before installing the cv tool, it's important to understand what it is. Similar to Drush or Drupal Console, cv is a command-line tool for installing and managing CiviCRM. It locates and boots the CiviCRM installation. Run these two commands to install the cv tool on your system:

sudo curl -LsS https://download.civicrm.org/cv/cv.phar -o /usr/local/bin/cv
sudo chmod +x /usr/local/bin/cv

Download CiviCRM packages

We will use Composer to download the CiviCRM packages:

composer require civicrm/civicrm-{core,packages,drupal-8}
composer require civicrm/cli-tools

Download translations for CiviCRM (optional)

This step is optional and useful if you have a multilingual site or need translations for CiviCRM.

mkdir -p web/sites/default/files/civicrm/l10n/fr_FR/LC_MESSAGES
curl -Lss -o web/sites/default/files/civicrm/l10n/fr_FR/LC_MESSAGES/civicrm.mo https://download.civicrm.org/civicrm-l10n-core/mo/en_EN/civicrm.mo
export CIVICRM_L10N_BASEDIR=/var/drupalsites/techx/web/sites/default/files/civicrm/l10n

Install CiviCRM

To install CiviCRM, use the cv CLI tool:

cv core:install --cms-base-url="[WEBSITE URL]" --lang="en_EN"

Replace the [WEBSITE URL] token with your website's base URL. Once CiviCRM is installed, you can visit your website's CiviCRM page at [WEBSITE URL]/civicrm. Note that CiviCRM will be installed on your existing Drupal database. While it is possible to install CiviCRM on a separate database, we will not cover that in this article (part 2, maybe? Subscribe to our newsletter so you don't miss any of our latest articles!). To explore more about CiviCRM and Drupal integration, visit their demo page: select your preferred language, choose Drupal 10 in the CMS option, and click on the "Try Demo" button.

Final Thoughts

Drupal CiviCRM integration is a powerful way for organizations to enhance their website functionality and streamline relationship-management processes. By combining the strengths of Drupal and CiviCRM, organizations can create a cohesive online presence that effectively engages constituents and supports their mission. Following a systematic approach and implementing best practices can lead to a successful integration that drives growth and success for the organization. Talk to our Drupal experts today to integrate your Drupal website with CiviCRM or any other CRM tool!
Categories: FLOSS Project Planets

Python Bytes: #386 Major releases abound

Planet Python - Tue, 2024-06-04 04:00
Topics covered in this episode:

  • NumPy 2.0 release date is June 16
  • Uvicorn adds multiprocess workers
  • pixi
  • JupyterLab 4.2 and Notebook 7.2 are available
  • Extras
  • Joke

Watch on YouTube

About the show

Sponsored by Mailtrap: pythonbytes.fm/mailtrap

Connect with the hosts

  • Michael: @mkennedy@fosstodon.org
  • Brian: @brianokken@fosstodon.org
  • Show: @pythonbytes@fosstodon.org

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Tuesdays at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list. We'll never share it.

Brian #1: NumPy 2.0 release date is June 16

  • "This release has been over a year in the making, and is the first major release since 2006. Importantly, in addition to many new features and performance improvement, it contains breaking changes to the ABI as well as the Python and C APIs. It is likely that downstream packages and end user code needs to be adapted - if you can, please verify whether your code works with NumPy 2.0.0rc2."
  • NumPy 2.0.0 Release Notes
  • NumPy 2.0 migration guide, including "try just running ruff check path/to/code/ --select NPY201"
    • "Many of the changes covered in the 2.0 release notes and in this migration guide can be automatically adapted in downstream code with a dedicated Ruff rule, namely rule NPY201."

Michael #2: Uvicorn adds multiprocess workers

  • via John Hagen
  • The goal was to no longer need to suggest that people use Gunicorn on top of uvicorn. Uvicorn can now in a sense "do it all".
  • Steps to use it and background on how it works.

Brian #3: pixi

  • Suggested by Vic Kelson
  • "pixi is a cross-platform, multi-language package manager and workflow tool built on the foundation of the conda ecosystem."
  • Tutorial: Doing Python development with Pixi
  • Some quotes from Vic:
    • "Pixi is a project manager, written in Rust, that allows you to build Python projects without having Python previously installed. It's installable with Homebrew (brew install pixi on Linux and MacOS). There's support in VSCode and PyCharm via plugins. By default, pixi fetches packages from conda-forge, so you get the scientific stack in a pretty reliable and performant build. If a package isn't on conda-forge, it'll look on PyPI, or I believe you can force it to look on PyPI if you like."
    • "So far, it works GREAT for me. What really impressed me is that I got a Jupyter environment with CuPy utilizing my aging Nvidia GPU on the FIRST TRY."

Michael #4: JupyterLab 4.2 and Notebook 7.2 are available

  • JupyterLab 4.2.0 has been released! This new minor release of JupyterLab includes 3 new features, 20 enhancements, 33 bug fixes and 29 maintenance tasks.
  • Jupyter Notebook 7.2.0 has also been released
  • Highlights include:
    • Easier Workspaces Management with GUI
    • Recently opened/closed files
    • Full notebook windowing mode by default (renders only the cells visible in the window, leading to improved performance)
    • Improved Shortcuts Editor
    • Dark High Contrast Theme

Extras

Brian:

  • Help test Python 3.13!
  • Help us test free-threaded Python without the GIL (both from Hugo van Kemenade)
  • Python Test 221: How to get pytest to import your code under test is out

Michael:

  • Bend follow up from Bernát Gábor:
    • "Bend looks roughly like Python but is nowhere there actually. For example it has no for loops; instead you're meant to use the bend keyword (hence the language name) to expand calculations and another keyword to join branches. So basically think of something that resembles Python at a high level, but without being compatible with it and without any of the standard library or packages the Python language provides. That being said, it does an impressive job at parallelization, but essentially it's a brand new language with new syntax and paradigms that you will have to learn; it just shares similarities with Python at first look."

Joke: Do-while
Categories: FLOSS Project Planets

joshics.in: Drupal 10: Not Just Another Upgrade, But a Revolution in Web Development

Planet Drupal - Tue, 2024-06-04 01:12
Drupal 10: Not Just Another Upgrade, But a Revolution in Web Development bhavinhjoshi Tue, 06/04/2024 - 10:42

Ever had to scratch your head, ferreting out the secret weapon behind those mammoth success stories of the digital world? Well, ponder no more, you've just stumbled upon the world of Drupal 10!

As the latest prodigy of the Drupal family, Drupal 10 isn't just a facelift but a tool that champions uncharted territories. With an arsenal of new features, it's here to empower developers - both seasoned and novice ones.

Let's picture this through an everyday scenario. You're a site developer catering to a global audience. Earlier, you'd have to have multiple versions of the site to cater to various languages. Drupal 10 simplifies this with its built-in multilingual feature. You can now translate and localise your website without bothersome plugins.

Consider another common tussle developers face - keeping up with the digital world's insatiable appetite for interactive and responsive designs. With Drupal 10's layout builder, you can create and customize layouts per content type, per node, or even per user role.

Even SEO, the lifeline of online visibility, gets a boost with Drupal 10's in-built Metatag and Pathauto modules. Now you can ensure your site stays in the limelight with less effort and more smart work.

Yet the standout trait of Drupal 10, one that deserves a tip of the hat, is the ease of upgrade. Remember the hair-pulling days of upgrading from Drupal 7 to 8? Those days are in the rear view mirror as Drupal 10 ensures a seamless upgrade, promising the preservation of custom themes and modules.

In conclusion, Drupal 10 is not just a new version, but a pivotal shift in web development that caters to today's fast-paced, ever-evolving digital landscape. It's about time we embrace it!

Tags: Drupal 10, Drupal Planet
Categories: FLOSS Project Planets

Haruna 1.1.2

Planet KDE - Mon, 2024-06-03 23:00

Haruna version 1.1.2 is out.

You can get it now on flathub:

Availability of other package formats depends on your distro and the people who package Haruna.

Windows version:

If you like Haruna then support its development: GitHub Sponsors | Liberapay | PayPal

Feature requests and bugs should be posted on bugs.kde.org, but for bugs make sure to fill in the template and provide as much information as possible.

Changelog: 1.1.2

Bugfixes:

  • Disabled track selection menus and buttons when there are no tracks to be selected
  • Fixed custom command toggling
  • Re-added "Scroll to playing item" and "Trash file" options to playlist context menu, lost during Qt6 port
  • Fixed some mpv properties not being correctly set at startup
  • Fixed video rendering on Windows
Categories: FLOSS Project Planets

Plasma 6 and 'traditional' window tiling

Planet KDE - Mon, 2024-06-03 20:00

I was keeping myself on Plasma 5.x until recently. I got so accustomed to the Bismuth window tiling script for KWin that I couldn’t imagine myself updating to Plasma 6.x where Bismuth doesn’t work.

Unfortunately (?), one of the recent Debian updates broke Bismuth in Plasma 5.x as well, so I had nothing keeping me on the old version anymore. I’m now (again) running the development version of (most) KDE software.

Since the update, I managed to make the Qtile tiling window manager work with Plasma to some extent. But the integration between Qtile and Plasma I hacked together was less than ideal, and I kept switching between KWin, which worked perfectly (as KWin does) but without tiling, and my Frankenstein Qtile, which didn't work that well but had tiling.

Maybe I’ll write about it if I get back to hacking Qtile, but that might not happen any time soon because…

Krohnkite

Then I saw the news that the predecessor of Bismuth, the Krohnkite script, has been ported to KWin 6: see the announcement on Reddit, GitHub to get and review the code, and the KDE Store for the package you can install.

Huge kudos to everyone involved in the rebirth; the script works as well as it did with KWin 5.

Window decoration

The only thing missing was the simple ‘just a line around the window’ window decoration that Bismuth had.

KWin 6 and Krohnkite + Bismuth decoration

Now we have that as well: I've ported the original Bismuth window decoration to KWin 6 (nothing huge, just a few tiny changes to make it compile). The code and installation instructions are available on GitHub.

You can support my work on Patreon, or you can get my book Functional Programming in C++ at Manning if you're into that sort of thing.
Categories: FLOSS Project Planets

Reproducible Builds (diffoscope): diffoscope 270 released

Planet Debian - Mon, 2024-06-03 20:00

The diffoscope maintainers are pleased to announce the release of diffoscope version 270. This version includes the following changes:

[ Chris Lamb ]

  • No-change release due to broken version 269 tarballs

You can find out more by visiting the project homepage.

Categories: FLOSS Project Planets

Pages