Feeds
Reproducible Builds (diffoscope): diffoscope 275 released
The diffoscope maintainers are pleased to announce the release of diffoscope version 275. This version includes the following changes:
[ Chris Lamb ]
- Update the test_zip.py text fixtures and definitions to support new changes to IO::Compress. (Closes: #1078050)
- Do not call marshal.loads(...) on precompiled Python bytecode, as it is inherently unsafe. Replace it, at least for now, with a brief summary of the code section of .pyc files. (Re: reproducible-builds/diffoscope#371)
- Don't bother to check the Python version number in test_python.py: the fixture for this test is deterministic/fixed.
- Update copyright years.

You can find out more by visiting the project homepage.
ImageX: Bold, Sleek, and Endlessly Customizable: the Gin Theme for Next-Level Drupal Admin Experiences
Authored by Nadiia Nykolaichuk.
Generate Python Bindings for C++ code using Shiboken
This will be a guide on how to generate Python bindings for your C++ library using Shiboken. Shiboken is a tool specifically created to build PySide, so it supports Qt code perfectly fine.
The steps described here require the in-progress merge request that adds the necessary code to Extra CMake Modules. I hope that it gets merged soon (I’ll update the post). I’ll use KUnitConversion as an example, because it’s a small library.
We’ll start by adding the build instructions. This part is mostly boilerplate, as it’s the same for any library (apart from the obvious change of the library name):
set(bindings_library "KUnitConversion")

set(wrapped_header ${CMAKE_SOURCE_DIR}/python/bindings.h)
set(typesystem_file ${CMAKE_SOURCE_DIR}/python/bindings.xml)

set(generated_sources
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_module_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_converter_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_unit_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_unitcategory_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_updatejob_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_value_wrapper.cpp)

set(qt_libs Qt6::Core)

ecm_generate_python_bindings(
    PACKAGE_NAME ${bindings_library}
    VERSION ${KF_VERSION}
    WRAPPED_HEADER ${wrapped_header}
    TYPESYSTEM ${typesystem_file}
    GENERATED_SOURCES ${generated_sources}
    INCLUDE_DIRS ${CMAKE_INSTALL_PREFIX}/${KDE_INSTALL_INCLUDEDIR_KF}/KUnitConversion
    QT_LIBS ${qt_libs}
    QT_VERSION ${REQUIRED_QT_VERSION}
    HOMEPAGE_URL "https://invent.kde.org/frameworks/kunitconversion"
    ISSUES_URL "https://bugs.kde.org/describecomponents.cgi?product=frameworks-kunitconversion"
)

target_link_libraries(${bindings_library} PRIVATE KF6UnitConversion)
install(TARGETS ${bindings_library} LIBRARY DESTINATION "${KDE_INSTALL_LIBDIR}/python-kf6")

Let’s see what each part does.
set(bindings_library "KUnitConversion")

set(wrapped_header ${CMAKE_SOURCE_DIR}/python/bindings.h)
set(typesystem_file ${CMAKE_SOURCE_DIR}/python/bindings.xml)

set(generated_sources
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_module_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_converter_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_unit_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_unitcategory_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_updatejob_wrapper.cpp
    ${CMAKE_CURRENT_BINARY_DIR}/KUnitConversion/kunitconversion_value_wrapper.cpp)

set(qt_libs Qt6::Core)

The first line just defines the name of the Python library we’ll build later. Then we set the header file that includes all the necessary headers of the library. We’ll see that file later. The generated sources list is a bit more complicated, as you need to guess the names of the files generated by Shiboken. Fortunately, you can probably guess the pattern from the example above: the first two files are always the same (the name of the library + _module_wrapper or _wrapper) and the rest is the list of classes defined in your XML file (more about it later). The qt_libs variable just contains a list of the Qt modules that our library requires. You should list all of them, even if one depends on another, because otherwise Shiboken won’t be able to find the include directories.
ecm_generate_python_bindings(
    PACKAGE_NAME ${bindings_library}
    VERSION ${KF_VERSION}
    WRAPPED_HEADER ${wrapped_header}
    TYPESYSTEM ${typesystem_file}
    GENERATED_SOURCES ${generated_sources}
    INCLUDE_DIRS ${CMAKE_INSTALL_PREFIX}/${KDE_INSTALL_INCLUDEDIR_KF}/KUnitConversion
    QT_LIBS ${qt_libs}
    QT_VERSION ${REQUIRED_QT_VERSION}
    HOMEPAGE_URL "https://invent.kde.org/frameworks/kunitconversion"
    ISSUES_URL "https://bugs.kde.org/describecomponents.cgi?product=frameworks-kunitconversion"
)

This is the magic part. The ecm_generate_python_bindings function takes care of running Shiboken with all the required arguments, building the Python library and a wheel file to publish it on the Python Package Index (pypi.org). It has the following arguments:
- PACKAGE_NAME: Name of the Python library.
- VERSION: Version of the resulting library.
- WRAPPED_HEADER: The header file we talked about above.
- TYPESYSTEM: XML file with the type system information.
- GENERATED_SOURCES: The list of files that Shiboken will generate.
- INCLUDE_DIRS: A list of additional include directories that are required for the library. Typically, this is the location of the header files included in the WRAPPED_HEADER.
- QT_LIBS: The list of Qt libraries as explained above.
- QT_VERSION: The minimum required Qt version.
- HOMEPAGE_URL: A URL to the homepage of the project.
- ISSUES_URL: A URL where users can report bugs.
The last part links the Python bindings against the C++ library and installs them. Now let’s take a look at the header file:
#pragma once

// Make "signals:", "slots:" visible as access specifiers
#define QT_ANNOTATE_ACCESS_SPECIFIER(a) __attribute__((annotate(#a)))

#include <KUnitConversion/Converter>
#include <KUnitConversion/Unit>
#include <KUnitConversion/UnitCategory>
#include <KUnitConversion/Value>

Nothing exciting there, just the list of includes (and some Qt thing that I don’t understand).
The last file you need is the typesystem definition, where you tell Shiboken which things (classes, structs, enums, namespaces…) you want to include in your bindings and how it should interpret them. You can delete or rename functions, change the return type, modify the input parameters and many other things. You may want to take a look at the documentation, because the list of possible options is very large.
<?xml version="1.0"?>
<typesystem package="KUnitConversion">
    <load-typesystem name="typesystem_core.xml" generate="no" />

    <namespace-type name="KUnitConversion">
        <enum-type name="CategoryId" />
        <object-type name="Converter" />
        <object-type name="Unit" />
        <object-type name="UnitCategory" />
        <enum-type name="UnitId" />
        <object-type name="UpdateJob" />
        <object-type name="Value" />
    </namespace-type>
</typesystem>

You need to load the typesystems of the Qt libraries that you are using so Shiboken can understand what your code is referring to. They come included with PySide.
That’s all you need to generate the Python bindings for your library. The last step is building the project as you usually do.
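Once everything is built and installed, you can do a quick smoke test from Python. The snippet below is a minimal sketch rather than part of the original guide: the install path and the exact way Shiboken exposes the wrapped namespace are assumptions, so adjust them to whatever your build actually produced.

import sys

# Assumption: the bindings were installed into KDE_INSTALL_LIBDIR/python-kf6,
# which is usually not on the default module search path.
sys.path.append("/usr/lib64/python-kf6")

import KUnitConversion

# List the wrapped types to confirm that the typesystem entries
# (Converter, Unit, UnitCategory, Value, ...) made it into the module.
print([name for name in dir(KUnitConversion) if not name.startswith("_")])

If the wrapped classes show up, you can start exercising the same API you would use from C++, for example constructing a Converter and converting Value objects between units.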
Joerg Jaspert: Electric Car, Vacation trip
A while ago I got my hands on an electric car - after not having owned a car for most of my life (there really is not much need here). It wasn’t planned nor a goal of mine, but it kind of “came out of talks with my boss”, so now it’s there.
Due to some special rules in the German tax system it turns out really cheap for me - it comes from the company, which allows personal use. So I have to pay taxes on its value plus whatever number of kilometers I have to drive to work. And the latter is what makes it good for me - I have home office in my contract, so there is no drive to work, except maybe once a year for something. So there is no regular trip to calculate, only if I ever really have to make a trip to the office.
And it being an electric car, running costs are also cheap. Way below the outdated tech that needs gas to run, is annoyingly loud and stinks.
The car
In the past I used either car sharing or rented a car when I needed one, depending on what I actually needed. The last few times I rented, I already tried electric variants, so I had something to compare this with.
What I have now is a “Citroen e-Berlingo XL” from 2023 (so not the latest revision from 2024), which is on the huge side for space but small for battery. It has 7 seats (though I currently have the last 2 taken out, no daily need) and more storage space than I need, even on a vacation trip.
The engine (or well, its battery) is on the small side - a capacity of 50 kWh means it only has a 278 km range according to WLTP. In reality it is a good bit less - as usual, those numbers flatter the manufacturer. Turns out that on highways, in a more realistic mode than WLTP (read: real driving), it’s somewhere around 120 km before one wants a charger again. But, looking around, at least in Germany that is not a big problem; there are more than enough chargers available.
Driving
The actual driving experience is good. It sure is a huge car (4.7 m long, 1.85 m high/wide) and feels more like driving a (small) bus, but it is easy to handle. Maximum speed is limited to 135 km/h (or the small battery would suck even more), but that is more than enough, even on German highways. I did my vacation trip with cruise control set to 115 km/h and only rarely went above that manually. Really relaxed driving, that was.
Vacation trip
So we had a vacation just recently, and instead of renting a car for the trip we, of course, wanted to take the e-Berlingo. The distance was about double what the car can (realistically!) do, so one charging stop somewhere in the middle was a must. Not having had to charge on highways yet - and the car being entirely new to me - that made for a bit of nervousness, but it all turned out really well. There are really nice tools like ABRP to plan your trip including charging, which can take live data on the availability of charging points into the planning.
And it turned out nice - we reached our planned charging point and found a long queue of cars waiting. But it turned out it was all those poor folks who need actual gasoline for their outdated combustion engines. The charging points for EVs still had enough free space, so we could bypass the queue and start charging directly. We also did not need to repark the car after just a few minutes; we could start our break right away.
With a charging time of approx. 30 minutes using the fast charger, such a break is long enough to get enough energy for the next part of the trip and short enough to not be annoying.
Charging prices
At home it depends. If one has a photovoltaic system to get power from, charging is basically free. If not, it depends on whichever contract one has; costs will be somewhere between 0 and ~30 cents per kWh. Not much, and way below gasoline costs.
Outside, using a fast charger, prices vary depending on where you charge - and with what charging card. Prices are between 40 and 70 cents/kWh, and even the same charging point can vary, depending purely on the card one uses. That is a thing the EU could actually go and regulate better, similar to the phone regulations it passed. Still, the costs are way below gasoline.
Charging cards
There is a huge number of different providers available, and they all do their own thing in pricing and in how one can use them. There are standards (say, the plugs are standardized, and by now the way to start charging is too), and that enables roaming (using a charging card of one provider at a charging point of another), but other than that, it seems to be random.
That is - if you use card A at a charging point of provider B you may pay €0.49, while card C at the same point may charge you €0.79. And card D isn’t accepted at all. With some (the newer ones) you can pay directly by credit card; with many you can’t. Some may allow PayPal or Google/Apple Pay. So in the end you need more than just one charging card - I’ve collected 8 free ones by now - just to be sure you can find a combination that isn’t hugely overpriced.
Drupal blog: Drupal CMS: the official name for Drupal Starshot
This blog has been re-posted and edited with permission from Dries Buytaert's blog.
We're excited to announce that "Drupal CMS" will be the official name for the product developed by the Drupal Starshot Initiative.
The name "Drupal CMS" was chosen after user testing with both newcomers and experienced users. This name consistently scored highest across all tested groups, including marketers unfamiliar with Drupal.
Participants appreciated the clarity it brings:
Having the words CMS in the name will make it clear what the product is. People would know that Drupal was a content management system by the nature of its name, rather than having to ask what Drupal is.
I'm a designer familiar with the industry, so the term CMS or content management system is the term or phrase that describes this product most accurately in my opinion. I think it is important to have CMS in the title.
The name "Drupal Starshot" will remain an internal code name until the first release of Drupal CMS, after which it will most likely be retired.
Mastering Cross-platform Desktop Apps
Creating applications for cross-platform compatibility is a modern best practice. It increases deployment flexibility and allows applications to reach a wider audience. However, doing it properly can involve some trial and error. At KDAB, we’ve built many multiplatform desktop applications. Here, we’ve compiled a few insights from that process to help you build better software.
The case for multiplatform applications
If you’re not yet sold on building multiplatform applications, here are three straightforward reasons why it makes sense for both you and your users.
- Bigger audience: The most obvious reason for creating a multiplatform application is to reach a bigger audience. Why narrow your potential market share from the start? Targeting all main platforms makes sure you don’t exclude potential users.
- Compiler diversity: Compiling your application across multiple platforms requires using different compilers. This pushes you towards writing standards-compliant code and enhances software quality by exposing compiler-specific warnings and errors.
- Environment consistency: Restricting your builds to a single OS tailors the application to the peculiarities of that environment. Maybe that’s okay in the short term, but if (or when) you move your software, it will create problems.
Setting up the CI build system to only build for the platform you’re actively developing and testing on might seem to save time. However, we recommend that your CI system always builds all platforms. This approach ensures that changes work across all operating systems, preventing developers on other platforms from having to fix your bugs, which is inefficient and error prone.
Why you shouldn’t ignore Linux
Many companies provide Windows and Mac variants of their applications but leave Linux out of the picture. We suggest including Linux for several reasons.
First, if you’re using tools that target both Windows and Mac, adding Linux isn’t too big of a stretch. Even if your main user base doesn’t include many Linux users, you’ve certainly got Linux lovers among your developers. Developers on your team will appreciate the ability to use your application on their favorite platform. Most importantly, the Linux ecosystem is rich with open-source tools for debugging, profiling, and optimizing applications, which can significantly enhance developer productivity and code robustness. Building on all three main platforms ensures you have the widest array of tools to find issues and improve your software.
Multiplatform tools
What tools make sense for creating the best multiplatform environment? Here are a few to consider:
- Analyzers, linters, and profilers: Regular use of linters and static analyzers such as Clang Static Analyzer, Clang-Tidy, Clazy, and SonarQube helps maintain code quality by detecting potential errors before they become problematic, while profilers and performance analyzers like Perf, Valgrind, VTune, and Hotspot help find and fix difficult bugs.
- Cross-platform development environments: IDEs such as VS Code, Qt Creator, and Eclipse offer extensive support for multiplatform development. These IDEs can launch multiplatform build environments like CMake, allowing both visual and command-line control over your software development process.
- Containerization and virtualization: Tools like Docker, Kubernetes, VirtualBox, or other VMs can simulate different operating environments on a single hardware platform. This is great for quickly spinning up developers and new development environments and makes for efficient cross-platform testing.
The primary benefit of multiplatform is the flexibility it provides. This flexibility doesn’t just let users choose their platform of choice. It also allows developers to access tools from across the development spectrum and run in environments where they’re most efficient. If this sounds interesting, check out our desktop best practice guide.
About KDAB
If you like this article and want to read similar material, consider subscribing via our RSS feed.
Subscribe to KDAB TV for similar informative short video content.
KDAB provides market leading software consulting and development services and training in Qt, C++ and 3D/OpenGL. Contact us.
Matt Layman: PDF Text Extraction With Python
Drupal Starshot blog: Drupal CMS: the official name for Drupal Starshot
We're excited to announce that "Drupal CMS" will be the official name for the product developed by the Drupal Starshot Initiative.
The name "Drupal CMS" was chosen after user testing with both newcomers and experienced users. This name consistently scored highest across all tested groups, including marketers unfamiliar with Drupal.
Participants appreciated the clarity it brings:
Having the words CMS in the name will make it clear what the product is. People would know that Drupal was a content management system by the nature of its name, rather than having to ask what Drupal is.
I'm a designer familiar with the industry, so the term CMS or content management system is the term or phrase that describes this product most accurately in my opinion. I think it is important to have CMS in the title.
The name "Drupal Starshot" will remain an internal code name until the first release of Drupal CMS, after which it will most likely be retired.
— Dries Buytaert
Plasma Dialer 24.08 is out
After a long wait, Plasma Dialer 24.08 is finally out. This release is based on Qt6 and contains 17 months of bug fixing as well as small improvements all over the place.
Packager Section
You can find the package on download.kde.org, and it has been signed with my (Carl's) GPG key.
Tag1 Consulting: Migrating Your Data from Drupal 7 to Drupal 10: Preparing for field migrations
Series Overview & ToC | Previous Article | Next Article - coming soon!

So far we have migrated three entity types: content types, taxonomy vocabularies, and paragraphs. It is very common that fields are attached to those, and other entities, to collect and display data. Field migrations can be tricky. For one, it is a multi-step process that requires, at a minimum, four different migrations. Additionally, it is common to find errors because field-related configuration used in Drupal 7 is not available in Drupal 10. In this article, we take a pause from executing migrations to understand how fields work in Drupal. The information presented today will prove useful for custom migrations, especially when they include content model changes.

Understanding Drupal fields

Drupal fields are used to provide structure to the information the CMS stores. They save discrete data, which can be used for displaying, filtering, and sorting purposes. Fields are attached to entities like nodes, users, taxonomy terms, blocks, etc. For entities that can have bundles, each bundle can have a different set of fields attached to them. The node entity, for example, almost always has a different set of fields attached to each content type...
Read more

mauricio Wed, 08/14/2024 - 15:40

Real Python: The Walrus Operator: Python's Assignment Expressions
Each new version of Python adds new features to the language. Back when Python 3.8 was released, the biggest change was the addition of assignment expressions. Specifically, the := operator gave you a new syntax for assigning variables in the middle of expressions. This operator is colloquially known as the walrus operator.
This tutorial is an in-depth introduction to the walrus operator. You’ll learn some of the motivations for the syntax update and explore examples where assignment expressions can be useful.
In this tutorial, you’ll learn how to:
- Identify the walrus operator and understand its meaning
- Understand use cases for the walrus operator
- Avoid repetitive code by using the walrus operator
- Convert between code using the walrus operator and code using other assignment methods
- Use appropriate style in your assignment expressions
Note that all walrus operator examples in this tutorial require Python 3.8 or later to work.
Get Your Code: Click here to download the free sample code that shows you how to use Python’s walrus operator.
Take the Quiz: Test your knowledge with our interactive “The Walrus Operator: Python's Assignment Expressions” quiz. You’ll receive a score upon completion to help you track your learning progress:
The Walrus Operator: Python's Assignment Expressions
In this quiz, you'll test your understanding of the Python Walrus Operator. This operator was introduced in Python 3.8, and understanding it can help you write more concise and efficient code.
Walrus Operator Fundamentals
First, look at some different terms that programmers use to refer to this new syntax. You’ve already seen a few in this tutorial.
The := operator is officially known as the assignment expression operator. During early discussions, it was dubbed the walrus operator because the := syntax resembles the eyes and tusks of a walrus lying on its side. You may also see the := operator referred to as the colon equals operator. Yet another term used for assignment expressions is named expressions.
Hello, Walrus!
To get a first impression of what assignment expressions are all about, start your REPL and play around with the following code:
1>>> walrus = False
2>>> walrus
3False
4
5>>> (walrus := True)
6True
7>>> walrus
8True

Line 1 shows a traditional assignment statement where the value False is assigned to walrus. Next, on line 5, you use an assignment expression to assign the value True to walrus. After both lines 1 and 5, you can refer to the assigned values by using the variable name walrus.
You might be wondering why you’re using parentheses on line 5, and you’ll learn why the parentheses are needed later on in this tutorial.
Note: A statement in Python is a unit of code. An expression is a special statement that can be evaluated to some value.
For example, 1 + 2 is an expression that evaluates to the value 3, while number = 1 + 2 is an assignment statement that doesn’t evaluate to a value. Although running the statement number = 1 + 2 doesn’t evaluate to 3, it does assign the value 3 to number.
In Python, you often see simple statements like return statements and import statements, as well as compound statements like if statements and function definitions. These are all statements, not expressions.
There’s a subtle—but important—difference between the two types of assignments with the walrus variable. An assignment expression returns the value, while a traditional assignment doesn’t. You can see this in action when the REPL doesn’t print any value after walrus = False on line 1 but prints out True after the assignment expression on line 5.
You can see another important aspect about walrus operators in this example. Though it might look new, the := operator does not do anything that isn’t possible without it. It only makes certain constructs more convenient and can sometimes communicate the intent of your code more clearly.
Now you have a basic idea of what the := operator is and what it can do. It’s an operator used in assignment expressions, which can return the value being assigned, unlike traditional assignment statements. To get deeper and really learn about the walrus operator, continue reading to see where you should and shouldn’t use it.
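To make the convenience concrete before moving on, here is a small, self-contained sketch (not taken from the tutorial) that contrasts the traditional pattern with the walrus version when reading data in fixed-size chunks:

import io

# A stand-in for a real binary file so the example runs as-is.
data = io.BytesIO(b"example " * 10_000)

# Traditional approach: assign, test, and break inside the loop body.
chunks = []
while True:
    chunk = data.read(8192)
    if not chunk:
        break
    chunks.append(chunk)

# Walrus approach: assign and test in the loop condition itself.
data.seek(0)
chunks_walrus = []
while chunk := data.read(8192):
    chunks_walrus.append(chunk)

print(len(chunks) == len(chunks_walrus))  # True: both collect the same chunks

The second loop says the same thing in fewer moving parts, which is exactly the kind of situation the operator was designed for.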
Implementation
Like most new features in Python, assignment expressions were introduced through a Python Enhancement Proposal (PEP). PEP 572 describes the motivation for introducing the walrus operator, the details of the syntax, and examples where the := operator can be used to improve your code.
This PEP was originally written by Chris Angelico in February 2018. Following some heated discussion, PEP 572 was accepted by Guido van Rossum in July 2018.
Since then, Guido announced that he was stepping down from his role as benevolent dictator for life (BDFL). Since early 2019, the Python language has been governed by an elected steering council instead.
The walrus operator was implemented by Emily Morehouse, and made available in the first alpha release of Python 3.8.
Motivation

Read the full article at https://realpython.com/python-walrus-operator/ »

[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]
Lukas Märdian: Netplan v1.1 released
I’m happy to announce that Netplan version 1.1 is now available on GitHub and is soon to be deployed into a Debian and/or Ubuntu installation near you! Six months and 120 commits after the previous version (including one patch release v1.0.1), this release is brought to you by 17 free software contributors from around the globe.
Kudos to everybody involved!
Highlights
- Custom systemd-networkd-wait-online logic override to wait for link-local and routable interfaces. (#456, #482)
- Modification of the embedded-switch-mode setting without virtual-function (VF) definitions on SR-IOV devices (#454)
- Parser flag to ignore individual, broken configurations, instead of not generating any backend configuration (#412)
- Fixes for @ProtonVPN (#495) and @microsoft Azure Linux (#445), contributed by those companies
- CI: adopt autopkgtest for 1.0-1 on 22.04 by @slyon in #446
- tools/keyfile_to_yaml: display the generated YAML by @daniloegea in #452
- tests: import the config fuzzing tests by @daniloegea in #453
- ATTN: parse/bonds: handle same primary in multiple bonds by @daniloegea in #451
- sriov: accept setting the eswitch mode without VFs (LP#2020409) by @daniloegea in #454
- Custom systemd-networkd-wait-online override to wait on interfaces. (Closes: #1008995) (LP#2060311) by @slyon in #456
- Ignore bad NetDefs and files via parser flags by @daniloegea in #412
- networkd:apply: Drop handling of legacy wpa@ instance units by @slyon in #471
- migrate: support aliases by @Kristof0127 in #473
- networkd: add ipv6 ra overrides (LP#1973222) by @KhooHaoYit in #461
- netplan status --diff fixes and improvements by @daniloegea in #466
- apply: make sure that networkd is restarted when needed by @alfonsosanchezbeato in #449
- Don’t escape certain non-ascii characters by @daniloegea in #486
- networkd: make s-n-wait-online wait for at least one routable interface by @slyon in #482
- networkd: Implement ipv6-address-generation: stable-privacy by @tatokis in #480
- Implementing advmss ip route option by @barvius in #489
- meson: Add ‘testing’ option by @slyon in #493
- Add a scheduled workflow to run TICS by @daniloegea in #498
- ci: migrate to Ubuntu 24.04 by @daniloegea in #465
- Prepare Netplan v1.1 by @slyon in #504
- Fix wrong syntax in example by @fzakfeld in #459
- Tutorial improvements by @rkratky in #458
- added guide for contributing to the netplan documentation by @ade555 in #457
- Add initial SECURITY.md policy by @slyon in #478
- Create single-nic-vm-host.md by @ilvipero in #475
- Create single-nic-vm-host-with-vlans.md by @ilvipero in #476
- Create multi-nic-vm-host-with-bonds-and-vlans.md by @ilvipero in #477
- bullet point removal by @shirleyherox in #483
- Add netplan try to netplan tutorial by @davidekete in #494
- Update the docs checks runner to ubuntu-latest by @rkratky in #500
- Add spelling exceptions by @rkratky in #499
- Fix logging setup when python-rich is not present by @frhuelsz in #445
- parse-nm: add a workaround for the DoT DNS option (LP#2055148) by @daniloegea in #447
- parse: don’t remove datalist items during iteration by @daniloegea in #450
- parse: fix redefinition of gateway(4|6) by @daniloegea in #460
- python: elements of all must be strings by @daniloegea in #464
- CI: Fix DebCI check, using newer ‘meson’ from unstable by @slyon in #467
- tests: fix diff test with iproute2 6.8 by @daniloegea in #469
- cli/generate: skip daemon_reload with --mapping by @daniloegea in #470
- CI: fork spread to get snapcore/spread#179 fixes by @slyon in #472
- ctests: fix a memory leak in a unit test by @daniloegea in #474
- nm/nd: fix a couple of crashes by @daniloegea in #468
- test:integration: Try to improve test flakyness (Closes: #1069871) by @slyon in #481
- Security fixes (CVE-2022-4968) by @daniloegea in #484
- emitter: allow unicode characters in the emitter (LP#2071652) by @daniloegea in #485
- CLI:apply: call udevadm trigger, using --action=move (Closes: #1071220) (LP#2066344, LP#2071363) by @slyon in #479
- CI: fix CodeQL permissions by @slyon in #491
- ci: run meson tests with unbuffer by @daniloegea in #501
- ci/tics: install “expect” as a dependency by @daniloegea in #502
- generate: avoid calling ‘udevadm control --reload’ (LP#1999178) by @slyon in #488
- netplan ignores NetworkManager ipv4.route-metric (LP#2076172) by @calexandru2018 in #495
- Change default umask when creating directories (LP#2076319) by @rmalz-c in #497
- @frhuelsz made their first contribution in #445
- @fzakfeld made their first contribution in #459
- @Kristof0127 made their first contribution in #473
- @ade555 made their first contribution in #457
- @KhooHaoYit made their first contribution in #461
- @ilvipero made their first contribution in #475
- @shirleyherox made their first contribution in #483
- @tatokis made their first contribution in #480
- @barvius made their first contribution in #489
- @davidekete made their first contribution in #494
- @calexandru2018 made their first contribution in #495
- @rmalz-c made their first contribution in #497
Full Changelog: 1.0…1.1
The Drop Times: Steering Drupal’s Potential in Academia: Janna Malikova
The Drop Times: The Key to Rebuilding Drupal Communities: What Will Be Revealed at GovCon 2024?
Real Python: Quiz: The Walrus Operator: Python's Assignment Expressions
In this quiz, you’ll test your understanding of the Python Walrus Operator. This operator, used for assignment expressions, was introduced in Python 3.8 and can be used to assign values to variables as part of an expression.
[ Improve Your Python With 🐍 Python Tricks 💌 – Get a short & sweet Python Trick delivered to your inbox every couple of days. >> Click here to learn more and see examples ]
Python Anywhere: Postal code validation for card payments
We recently started validating that the postal codes used for paid PythonAnywhere accounts match the ones that people’s banks have on file for the card used. This has led to some confusion, in particular because banks handle postal code validation in a complicated way – charges that fail because of this kind of error can show up in your bank app as a payment that then disappears later, or even as a charge followed by a refund. This blog post is to summarise why that is, so hopefully it will make things a bit less confusing!
The long version…Card fraud is, sadly, a fact of life on the Internet. If you have a website that accepts payments, eventually someone will try to use a stolen card on it. If your site is online for some time, hackers might even start using you to test lists of stolen cards – that is, they don’t want to use your product in particular, they’re just trying each of the cards to find the ones that are valid, so that they can use them elsewhere.
We recently saw an uptick in the number of these “card probers” (as we call them internally) on PythonAnywhere. We have processes in place to identify them, so that we can refund all payments they get through, and report them as fraudulent to Stripe – our card processor – so that the cards in question are harder for them to use on other sites. But this takes time – time which we would much rather spend on building new features for PythonAnywhere.
Looking into the recent charges, we discovered that many of them were using the wrong postal code when testing the cards. The probers had the numbers, the expiry dates, the CVVs, but not the billing addresses. So we re-introduced something that had been disabled on our Stripe account for some time: postal code validation for payments. You may be wondering why it wasn’t enabled already, or why it might even be something that anyone would disable; this blog post is an introduction to why postal codes and card payments can be more complicated than you might think.
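For the technically curious: the result of that postal-code check is reported back by the card networks and exposed on the charge object. Below is a minimal sketch with the Stripe Python library showing where the result lives; the API key and charge ID are placeholders, and whether a failed check actually blocks a payment depends on your own validation and Radar settings.

import stripe

stripe.api_key = "sk_test_..."  # placeholder; use your own secret key

# Retrieve a charge (placeholder ID) and inspect the card verification checks.
charge = stripe.Charge.retrieve("ch_...")
checks = charge.payment_method_details.card.checks

# One of "pass", "fail", "unavailable", or "unchecked".
print("Postal code check:", checks.address_postal_code_check)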
Paolo Melchiorre: Python Software Foundation fellow member
The Python Software Foundation made me a PSF fellow member, along with Adam Johnson.
PyCharm
The new and improved AI Assistant for the 2024.2 versions of JetBrains IDEs is now out, featuring smarter and faster AI code completion for Java, Kotlin, and Python; an enhanced UX when working with code in the editor; AI functionality for Git conflict resolution, in-terminal code generation, new customizable prompts, improved test generation, and more.
Don’t have AI Assistant yet?
To experience the latest enhancements, simply open a project in your preferred JetBrains IDE version 2024.2, click the AI icon on the right toolbar to initiate the installation, and follow the instructions to enable it.
You can also experience free local AI completion functionality with full line code completion (FLCC) in your IDE of choice, including CLion and Rider starting from 2024.2. Learn more about FLCC in this blog post.
Faster and smarter cloud code completion
One of the main focuses of this release was to enhance the user experience of AI code completion in JetBrains IDEs. Here are some of the major advances we’ve made in this direction:

JetBrains code completion models for Python, Java, and Kotlin
We’ve significantly improved the quality and reduced the latency of our code completion for Java, Kotlin, and Python. These enhancements are powered by JetBrains’ internally trained large language models. Enhanced locations for cloud completion invocation extend the variety of usage scenarios, while improved suffix matching ensures that the predicted code snippet correctly completes the existing code.

Syntax highlighting for suggested code
Inline code completion suggestions now come with syntax highlighting, improving the readability of the suggested code.

Incremental acceptance of code suggestions
To simplify the process of reviewing suggestions, multiline code suggestions are now displayed only after accepting a single-line suggestion, allowing you to review and accept code gradually. Additionally, if you don’t want to accept an entire suggested line, you can accept it word by word using the same shortcut that you’d typically use to move the caret to the following word (Ctrl+→ for Windows and ⌥→ for macOS).

Seamless interaction of all available code completion types
We have made UX improvements to better integrate AI code completion features into IDE workflows. This includes a reworked UX for multiline completion and the ability to display suggestions alongside basic IDE completions.

Enhanced in-editor code generation
With the latest update, JetBrains IDEs now feature an improved AI code generation experience. Previously, generated code would open in a new tab. Now, it’s displayed directly in the current editor tab, allowing for an immediate review of the generated content. Check it out using the shortcut ⌘\ on macOS or Ctrl+\ on Windows and Linux.
AI chat becomes smarter

GPT-4o support
With the new release, AI Assistant now supports the latest GPT-4o model, bringing a boost to the AI Assistant’s chat-related functionalities, such as finding and explaining errors, explaining code, and refactoring.

Chat references and commands
We have introduced chat references and commands to enhance your AI Assistant’s chat experience, giving you more control over your context. Now, you can reference any symbols, allowing you to quickly indicate the context of your query and get more precise responses. Additionally, you can easily mention specific files or uncommitted local changes. Supported commands include /explain and /refactor, allowing you to quickly get explanations or refactor selected code without typing out questions in the chat.

New feature: merge VCS conflicts with AI
When multiple contributors are making changes to the same part of the codebase, and you try to pull your changes, conflicts may arise. To avoid any issues down the line, JetBrains IDEs now provide a tool for reviewing and resolving any such conflicts. Starting from version 2024.2, the Git conflict resolution modal dialog features AI capabilities to assist with merging conflicts. After AI has done its job, you can review the merged result and either accept everything or revert the changes individually.

New feature: AI-powered command generation in the new Terminal
Generate commands with AI directly in your IDE via the new Terminal tool window. This integration ensures you can efficiently complete command-line tasks without distraction, improving your overall workflow.

Enhanced unit test generation with AI Assistant
Starting from version 2024.2, the Generate Unit Tests action can be invoked not only on methods but also on classes. If a class has multiple methods, the AI will automatically choose the most suitable one for testing. The latest update also includes more customization options for unit test generation.

Customizable unit test guidelines
Users can set their own unit test guidelines by customizing the test generation prompt in the AI Assistant’s Prompt Library. This allows you to add specific testing rules for Java, Kotlin, JavaScript, Go, Python, PHP, and Ruby.

Adding test cases to existing tests
AI Assistant now supports adding new test cases to existing test files for Java and Kotlin, allowing you to generate new tests using AI.

Custom prompts for documentation generation
The latest update to JetBrains IDEs introduces customizable documentation generation prompts. This feature allows the model to generate documentation for a selected code element and inserts it directly into the code. Users can now define the desired content of the generated documentation for different languages and specify various formatting options, such as Javadoc for Java, ensuring the documentation adheres to preferred styles and standards.

Natural Language setting
You can now specify the language in which you want to interact with the AI chat via Settings. After enabling the Natural Language setting, the context of the current chat will be updated, and any new answers generated by the AI will be provided in the user’s chosen language.

Using AI for working with databases
The new release brings AI to a variety of database-specific features within JetBrains IDEs. You can try these out in DataGrip or in a JetBrains IDE of your choice using the bundled Database Tools and SQL plugin.

Get AI assistance when modifying tables
AI Assistant can now help you change the database-specific parameters of a table. Ask AI Assistant to modify a table according to your requirements right in the Modify dialog. Once AI Assistant generates the requested SQL code, you’ll be able to review it in the preview pane of the dialog and then apply the changes.
Explain and fix SQL problems
DataGrip’s code inspections detect various issues with your SQL queries before execution, which are then categorized according to predefined severity levels.
The latest update integrates AI to enhance the comprehension and resolution of SQL problems. For issues with a severity level higher than Weak warning, the AI Assistant offers explanations and fixes. For better context and more accurate suggestions, you can also attach your database schema.
AI Enterprise: unlocking organizational productivity
Are you looking to maximize productivity at an organizational scale? AI Enterprise runs on premises as part of JetBrains IDE Services, ensuring complete control over data and AI operations within your organization’s infrastructure. It also provides AI usage statistics and reports, offering insights into how AI tools are utilized across your development teams. Learn more about AI Enterprise.

Enhance your writing with Grazie, now included in the AI Pro subscription plan
We’re excited to share that Grazie, our AI writing companion for people in tech, is now included in the AI Pro subscription plan. Use Grazie to transform your thoughts into clear, well-articulated writing, with features like instant proofreading, inline text completion, summarization, translation, rephrasing, and more!
Grazie is now available as a plugin for your JetBrains IDEs and as an extension for browsers. While there is a free version, AI Pro subscribers enjoy full volume access to the entire suite of Grazie’s AI features, which is 500 times greater than the basic volume and replenishes weekly.
Explore AI Assistant and share your feedback
You can learn more about AI Assistant’s key features here. However, the best way to explore its capabilities is by trying it out yourself.
As always, we look forward to hearing your feedback. You can also tell us about your experience via the Share your feedback link in the AI Assistant tool window or by submitting feature requests or bug reports in YouTrack.
Happy developing!
PyCoder’s Weekly: Issue #642 (Aug. 13, 2024)
#642 – AUGUST 13, 2024
View in Browser »
This is part 9 in an in-depth series on testing. This part talks about using coverage tools to check how much of your code gets executed during tests, and how to use the nox tool to test against a matrix of Python and dependency versions.
BITE CODE!
In this tutorial, you’ll learn how to create and use asynchronous iterators and iterables in Python. You’ll explore their syntax and structure and discover how they can be leveraged to handle asynchronous operations more efficiently.
REAL PYTHON
Sounds tricky, right? Well, that’s exactly what Kraken Technologies is doing. Learn how they manage 100s of deployments a day and how they handle errors when they crop up. Sneak peek: they use Sentry to reduce noise, prioritize issues, and maintain code quality – without relying on a dedicated QA team →
SENTRY sponsor
This explains how to automatically attach contextual information to all the log messages coming out of a Python ASGI application.
REDOWAN DELOWAR • Shared by Redowan Delowar
The upcoming release of Python 3.13 has some great new features in the REPL making it easier to move around and edit your code. One feature is the ability to move word by word using CTRL+left and right arrow keys. Unfortunately, macOS traps these keys. This quick TIL post shows you how to fix that.
RODRIGO GIRÃO SERRÃO
In this tutorial, you’ll learn how to use Python’s rich set of operators and functions for working with strings. You’ll cover the basics of creating strings using literals and the str() function, applying string methods, using operators and built-in functions with strings, and more!
REAL PYTHON
What hurdles must be cleared when starting an international organization? How do you empower others in a community by sharing responsibilities? This week on the show, we speak with Jay Miller about Black Python Devs.
REAL PYTHON podcast
Nat is a consultant which means they spend a lot of time reading other people’s code. When trying to understand large systems taking notes along the way can help. This post talks about the techniques Nat uses.
NAT BENNETT
Have you ever needed a progress bar in your Python command-line application? One great way of creating a progress bar is to use the alive-progress package. This article shows you how.
MIKE DRISCOLL
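For a quick taste of what the article above covers, here is a minimal sketch of the package's basic pattern (assuming alive-progress is installed from PyPI; the sleep call stands in for real work):

from time import sleep
from alive_progress import alive_bar

items = range(100)
with alive_bar(len(items)) as bar:  # total number of steps
    for _ in items:
        sleep(0.01)                 # placeholder for real work
        bar()                       # advance the bar by one step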
PyCon US 2024 had a record breaking attendance with over 2,700 in-person tickets sold. This article is a recap from the conference runners and links to all the available recordings.
PYCON
This post describes Knuckledragger, a Z3 based semi-automated proof assistant. The post covers the basic design, applications, and a variety of theory types it can work with.
PHILIP ZUCKER
Evan has been using the ast.parse function a lot and has found it to be slow even though it is built as a C extension. This article digs into what is going on.
EVAN DOYLE
SonarQube is a freely available static code analysis tool. This article shows you what it can do and how to get it going on your system.
PRINCE ONYEANUNA
This post talks about the __all__ attribute and how it declares the public interface to a module, but does not enforce access.
CAELEAN BARNES
This practical article on regular expressions shows you how to build regexes to parse the logs from the nginx web server.
JUHA-MATTI SANTALA
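As a flavour of the approach in the article above, here is a small standard-library sketch that parses nginx's default "combined" log format (the sample line is made up):

import re

# nginx "combined" format:
# $remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent "$http_referer" "$http_user_agent"
LOG_PATTERN = re.compile(
    r'(?P<addr>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<bytes>\d+|-) '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

sample = '203.0.113.7 - - [13/Aug/2024:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 5123 "-" "curl/8.5.0"'
match = LOG_PATTERN.match(sample)
if match:
    print(match.group("addr"), match.group("status"), match.group("request"))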
Stephen uses a story-telling style to explain how operator precedence works in Python.
STEPHEN GRUPPETTA • Shared by Stephen Gruppetta
GITHUB.COM/POMPONCHIK • Shared by Evgeniy Blinov (pomponchik)
Events

Weekly Real Python Office Hours Q&A (Virtual)
August 14, 2024
REALPYTHON.COM
August 15, 2024
MEETUP.COM
August 15, 2024
PYLADIES.COM
August 16 to August 17, 2024
SHEDEVSPYTHONWORKSHOP.CO.ZW
August 21 to August 23, 2024
PYCON.ORG.SO
August 23 to August 26, 2024
KIWIPYCON.NZ
Happy Pythoning!
This was PyCoder’s Weekly Issue #642.
View in Browser »
[ Subscribe to 🐍 PyCoder’s Weekly 💌 – Get the best Python news, articles, and tutorials delivered to your inbox once a week >> Click here to learn more ]
Jonathan Dowland: ouch, part 2
Things developed since my last post. Some lesions opened up on my ankle, which was initially good news: the pain substantially reduced. But they didn’t heal fast enough, and so the medics decided on surgical debridement. That was last night. It seemed to be successful, and I’m in recovery from surgery as I write. It’s hard to predict the near future; a lot depends on how well and fast I heal.
I’ve got a negative-pressure dressing on it, which is incredible: a constantly maintained suction to aid in debridement and healing. Modern medicine feels like a sci fi novel.