FLOSS Project Planets

New Style Signal/Slot Connection

Planet KDE - Mon, 2019-06-17 05:16

Yes, I know, the last post on the assistants was rather boring. And yet these days I have been working on the snapshot docker, though it still seems a little (just a little, you see) unfinished, as Dmitry reports a relatively high delay when switching between snapshots. However, that is not something I can reproduce on my older laptop, so I am really waiting for his test results in order to investigate the problem further.

But something interesting happened just as I was randomly testing things. From Krita’s debug output, I saw QObject::connect() complaining about the arguments I passed, saying it was expecting parentheses. “Okay,” I thought, “then there has to be something wrong with the code I wrote.” And that was quite confusing. I remembered having used member function pointers in those places, getting a compile-time error since KisSignalAutoConnectionsStore did not support the new syntax, and then switching back to the SIGNAL() and SLOT() macros. KisSignalAutoConnectionsStore is a helper class to quickly (dis)connect a group of connections. One can use the addConnection() method to add a connection, and clear() to remove all connections made before.
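
For illustration, this is roughly how such a connection group is used with the old string-based syntax (the sender, receiver and member names here are made-up placeholders, not actual Krita symbols):

KisSignalAutoConnectionsStore connectionsStore;
// add a connection using the classic SIGNAL()/SLOT() macros
connectionsStore.addConnection(imageObject, SIGNAL(sigValueChanged(int)),
                               dockerObject, SLOT(slotSetValue(int)));
// ... and later drop every connection added above in one go
connectionsStore.clear();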

Well, everything was good, apart from the fact that I had missed the parentheses, which I did not discover until I looked into the debug output. So I asked Dmitry why we did not add the new syntax to KisSignalAutoConnectionsStore, and he said we should.

What is good about the new syntax is compile-time checking. We probably do not want our connections to fail only at run time just because there is a typo in the signature. That is definitely tiring and hard to catch (hmm, I did not notice the problem until I randomly glanced at the command line today; it could have been worse had I shipped the snapshot docker together with those careless bugs).

The modification to the code seems straightforward. Everything happens in the KisSignalAutoConnection class: in its constructor, the connection is made using QObject::connect(); in its destructor, the connection is removed by passing the same set of arguments to QObject::disconnect(). In current master the signature is just KisSignalAutoConnection(const QObject *, const char *, const QObject *, const char *), since the SIGNAL() and SLOT() macros do nothing more than prepend "2" and "1" respectively to their stringified arguments.
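
For reference, this is roughly how those macros are defined in Qt’s qobjectdefs.h (simplified; the debug-enabled variant additionally records the source location via qFlagLocation()):

// simplified from qobjectdefs.h
#define SLOT(a)     "1"#a
#define SIGNAL(a)   "2"#a

// so SLOT(slotTest1()) becomes the string "1slotTest1()"
// and SIGNAL(sigTest1()) becomes "2sigTest1()"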

So the problem we have is that we do not want the arguments that specify the signals and/or slots to be just strings. We want them to be pointers to member functions, or maybe lambdas. According to the QObject documentation, the signature of the new-style connect() is:

QMetaObject::Connection QObject::connect(const QObject *sender, PointerToMemberFunction signal, const QObject *context, Functor functor, Qt::ConnectionType type = Qt::AutoConnection)

Okay, so we know that sender and receiver should be pointers to QObjects, but we do not know the types of the signal and the functor. Now let’s make our KisSignalAutoConnection constructor a template function:

template<class Signal, class Method>
inline KisSignalAutoConnection(const QObject *sender, Signal signal,
                               const QObject *receiver, Method method,
                               Qt::ConnectionType type = Qt::AutoConnection);

But when these parameters are passed to QObject::connect(), we get a compile-time error, saying there is no matching overload for connect().

Why?

The answer is that the Qt documentation is simplifying, if not hiding, the truth. The real definition of connect() is found at line 227 of qobject.h:

template <typename Func1, typename Func2>
static inline QMetaObject::Connection connect(const typename QtPrivate::FunctionPointer<Func1>::Object *sender, Func1 signal,
                                              const typename QtPrivate::FunctionPointer<Func2>::Object *receiver, Func2 slot,
                                              Qt::ConnectionType type = Qt::AutoConnection)

And tracking down the definition of QtPrivate::FunctionPointer, we get it in qobjectdefs_impl.h:

template<class Obj, typename Ret, typename... Args> struct FunctionPointer<Ret (Obj::*) (Args...)>
{
    typedef Obj Object;
    ...
};

And seeing what we have passed to KisSignalAutoConnection (in the test code):

KisSignalAutoConnectionsStore conn;
conn.addConnection(test1.data(), &TestClass::sigTest1, test2.data(), &TestClass::slotTest1);

We can see that Func1 is a member function of TestClass, so QtPrivate::FunctionPointer<Func1>::Object is just TestClass. But the constructor of KisSignalAutoConnection receives a const QObject *. The problem here is that connect() is expecting a const TestClass *, but we give it a const QObject *. A base class pointer cannot be implicitly converted to a derived class pointer, so we get that error.
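
In other words (a contrived illustration, not actual Krita code):

// QtPrivate::FunctionPointer<decltype(&TestClass::sigTest1)>::Object is TestClass,
// so connect() wants a const TestClass * as the sender; a const QObject *
// does not convert implicitly in that direction:
const QObject *sender = test1.data();
// QObject::connect(sender, &TestClass::sigTest1,
//                  test2.data(), &TestClass::slotTest1);   // does not compile
QObject::connect(static_cast<const TestClass *>(sender),    // compiles, but ugly
                 &TestClass::sigTest1, test2.data(), &TestClass::slotTest1);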

The resolution seems pretty simple, as we only need to include the types of sender and receiver into the template, and pass everything as-is to QObject::connect():

template<class Sender, class Signal, class Receiver, class Method>
inline KisSignalAutoConnection(Sender sender, Signal signal, Receiver receiver, Method method,
                               Qt::ConnectionType type = Qt::AutoConnection);

Sounds viable. But how can we store the four parameters? It might be intuitive to make another base class, say, KisSignalAutoConnectionBase, and make KisSignalAutoConnection a template class, so we can store the sender, the receiver, etc.

But wait, isn’t this just too complex? First of all, we do not have any overridden functions except for the destructor. What is more, we do not seem to have anything valuable to put in that base class: it would be an empty class. The use of inheritance here would be ugly and useless.

And, we do not need to store the four parameters at all. QObject::connect() returns a QMetaObject::Connection, which can be used later to disconnect() it. So instead of the parameters passed to connect(), we just store the Connection object. And that is not part of the template:

public:
    template<class Sender, class Signal, class Receiver, class Method>
    inline KisSignalAutoConnection(Sender sender, Signal signal, Receiver receiver, Method method,
                                   Qt::ConnectionType type = Qt::AutoConnection)
        : m_connection(QObject::connect(sender, signal, receiver, method, type))
    {
    }

    inline ~KisSignalAutoConnection()
    {
        QObject::disconnect(m_connection);
    }

private:
    QMetaObject::Connection m_connection;

And with the test code mentioned above, we can make sure that the new implementation works well with both syntaxes.
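
For example, something along these lines now compiles and connects correctly (a sketch based on the test code above, with the signal and slot signatures simplified):

KisSignalAutoConnectionsStore conn;
// new syntax, checked at compile time
conn.addConnection(test1.data(), &TestClass::sigTest1,
                   test2.data(), &TestClass::slotTest1);
// old syntax, still accepted and only checked at run time
conn.addConnection(test1.data(), SIGNAL(sigTest1()),
                   test2.data(), SLOT(slotTest1()));
conn.clear();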

So, great, Krita developers, we can use the new syntax for auto connections as well.

PS: There will soon be another post on my work of the snapshot docker – it’s almost finished!

Categories: FLOSS Project Planets

heykarthikwithu: Run Simple test cases from terminal via Docker commands

Planet Drupal - Mon, 2019-06-17 04:48

heykarthikwithu Monday, 17 June 2019 - 14:18:28 IST
Categories: FLOSS Project Planets

Stanislas Morbieu: AI Paris 2019 in one picture

Planet Python - Mon, 2019-06-17 04:00

Last week, I was at the AI Paris 2019 event to represent Kernix. We had great talks with so many people, and I barely had time to go around and look at what other companies were working on. That is why I am looking at this afterwards: can we get a big picture of the event without being there?

It happens that the list of the companies is on the AI Paris 2019 website. With a little bit of Python code we can retrieve it. The following Python code retrieves the html content of the page we are interested in:

import urllib.request

url = "https://aiparis.fr/2019/sponsors-et-exposants/"
with urllib.request.urlopen(url) as response:
    html = response.read().decode('utf-8')

Note that response.read() returns bytes, so we have to decode it to get a string. The encoding used is utf-8, which can be seen in the charset attribute of the meta tag (and fortunately it is the correct encoding, which is not always the case). Looking at the html, we can notice that the data is in a javascript variable. Let's get the line where it is defined:

import re

line = re.findall(".*Kernix.*", html)[0]

By default, the regex does not span multiple lines. The line variable is almost a json. We can get the data as a Python dictionary with the following:

import json

i = line.index('[')
d = '{"all":' + line[i:-1] + '}'
data = json.loads(d)

Let's now get the descriptions of the companies concatenated into a single string:

import re

def clean_html(html):
    regex = re.compile('<.*?>')
    text = re.sub(regex, '', html)
    return text

corpus = [clean_html(e['description']) for e in data['all']]
text = " ".join(corpus)

The clean_html function is created since the descriptions contain html code, and removing the tags makes them more readable. A usual step when dealing with text is to remove "stopwords", i.e. words that do not carry meaningful information:

from nltk.corpus import stopwords

stop_words = (set(stopwords.words('french'))
              | {'le', 'leur', 'afin', 'les', 'leurs'}
              | set(stopwords.words('english')))

The NLTK library provides lists of stopwords. We use both the English and French lists since the content is mostly written in French, but some descriptions are in English. I added some French stopwords as well.

I am not a big fan of word clouds since we can only spot quickly a few words, but for this use case, we can get a pretty good big picture in very few lines:

Here is the code to generate it:

import random

import matplotlib.pyplot as plt
from wordcloud import WordCloud

def grey_color_func(word, font_size, position, orientation, random_state=None, **kwargs):
    return "hsl(0, 0%%, %d%%)" % random.randint(60, 100)

wordcloud = WordCloud(width=1600, height=800,
                      background_color="black",
                      stopwords=stop_words,
                      random_state=1,
                      color_func=grey_color_func).generate(text)

fig = plt.figure(figsize=(20, 10))
plt.imshow(wordcloud, interpolation="bilinear")
plt.axis("off")
plt.tight_layout(pad=0)
fig.savefig('wordcloud.png', dpi=fig.dpi)
plt.show()

The grey_color_func function enables a greyscale picture. A fixed random state enables reproducibility. Setting the width, the height and the figsize enables a high quality output. The margins are removed with the two lines:

plt.axis("off")
plt.tight_layout(pad=0)

Let's get back to the word cloud. We can immediately spot the words "solution" (it is the same in French and English), "data" (both in French "donnée" and English "data") and "company" ("entreprise" in French). The expressions "AI" ("IA" in French) and "artificial intelligence" ("intelligence artificielle") only come later, which is quite surprising for an event focusing on artificial intelligence as a whole. In fact, AI is often seen as a solution to use data in a useful way.

Many actors are working on how to store, manage and process data (which is very important), but only a few, from what I saw and heard, focus on creating value for people. I am always delighted to create something useful for people in their day-to-day life, whether it is a recommender system to discover things we never thought existed, automating boring tasks, improving research in the pharmaceutical sector, and so many other things. AI is about people!

Categories: FLOSS Project Planets

Mike Driscoll: PyDev of the Week: Meredydd Luff

Planet Python - Mon, 2019-06-17 01:05

This week we welcome Meredydd Luff (@meredydd) as our PyDev of the Week! Meredydd is the co-founder of Anvil and a core developer for the Skulpt package. You can learn more about Meredydd on his website. Let’s take a few moments to get to know him better!

Can you tell us a little about yourself (hobbies, education, etc):

I’ve loved programming since I was first introduced to BASIC at the age of 7. I come from Cambridge (the old one in the UK, not the relatively-new one near Boston), and I studied here too. I actually started out as a biologist, but then switched to computer science for my PhD.

I think programming is the closest thing to magic we have, and I love watching and helping people get their hands on this power. My PhD research was about building usable parallel programming systems, and now I work on Anvil, a tool to make web programming faster and easier for everyone (with Python!).

When I’m not programming, I fly light aeroplanes, which I guess is what happens when your inner six-year-old makes your life decisions. I used to dance competitively (including a few years on England’s top Latin formation team), but it turns out international competitions and startups don’t play well together, so the startup won.

Why did you start using Python?

I’d dabbled in Python a bit, but I only really started using it in earnest when we started creating Anvil. We wanted to make web development easier, by replacing the mess of five(!) different programming languages with one language and a sensible visual designer. Python was the obvious choice – it’s accessible, it’s predictable, and it has a huge and powerful ecosystem.

What other programming languages do you know and which is your favorite?

I’m a big fan of Clojure. It’s sort of the diametrical opposite of Python. Python is simple, concrete and predictable – it’s really a programming language designed for people. By contrast, Lisps like Clojure turn the abstraction up to 11, and make the person program like the compiler thinks.

I also have to tip my hat to C – if I’m using C, I must be having an adventure close to the hardware

What projects are you working on now?

These days I spend all my time on Anvil, a platform for building full-stack web apps with nothing but Python. There’s a drag-and-drop designer for your UIs, we run your client-side Python in the browser, and your server-side Python runs in our own serverless environment. We even have a Python-native database you can use.

So, whereas previously you’d need to learn HTML+CSS+JS+Python+SQL (plus all the frameworks, AWS, etc), now anyone who can write Python can build and deploy a web application with Anvil.

Which Python libraries are your favorite (core or 3rd party)?

It's tough, but I'd have to choose Skulpt, the Python-to-Javascript compiler. We'd used it before in an educational context, but we use it really heavily in Anvil. Obviously Skulpt is how we run client-side Python in the browser, but we use it in other ways too – for example, we use Skulpt's parser to drive our Python code completion! (I talked briefly about how our Python autocompleter works at PyCon UK.)

I’m one of the core maintainers these days – I’m currently working on a tear-down-and-rebuild of the front end, which is great fun for compiler nerds. If you want to join in, please drop us a line on GitHub!

Where did the idea behind Skulpt come from?

I can’t claim credit for Skulpt’s existence – the project was started by Scott Graham, and these days there’s a whole team of us. The original impetus was around education: When you’re first learning to code, setting up a Python environment is a big hassle, and so having a playground “just there” in the browser is a massive win. I suppose Anvil is one step further – we put a full-strength application development and deployment platform “just there” in your browser.

Can you tell us the story behind Anvil?

My cofounder Ian and I both learned to program using Visual Basic and similar tools. The 90s were a sort of golden age for that: Anyone who could use one (fairly simple) programming language could build apps that looked and worked like everything else on their desktop.

These days, everything is on the web, but the barrier to entry is huge: you need to learn all these languages and frameworks, plus Linux system administration, just to build your first app. It’s exhausting – and it cuts off so much opportunity for people like data scientists or electronic engineers, who have a job to do and don’t have time to learn all that stuff. Eventually, Ian and I got fed up of moaning about the situation, and decided to build something to fix it!

Anvil’s goal is to make web programming usable by everyone, but still powerful enough for seasoned professionals. We cut out the incidental complexity, but we keep the powerful programming language and the huge ecosystem.

Is there anything else you’d like to say?

Oh, yes – Use autocomplete!

Thanks for doing the interview, Meredydd!

The post PyDev of the Week: Meredydd Luff appeared first on The Mouse Vs. The Python.

Categories: FLOSS Project Planets

Kushal Das: DMARC, mailing list, yahoo and gmail

Planet Python - Mon, 2019-06-17 01:02

Last Friday late night, I suddenly started getting a lot of bounced emails from the dgplug mailing list. Within a few minutes, I received more than a thousand emails. A quick look inside of the bounce emails showed the following error:

Unauthenticated email from yahoo.in is not accepted due to domain's 550-5.7.1 DMARC policy.

Gmail was blocking one person's email via our list (he sent it using Yahoo, from his iPhone client), which put more than 1700 Gmail users on our list into the nomail block; they stay blocked unless they check for mailman's email and click to re-enable their membership.

I panicked for a couple of minutes and then started manually clicking through the mailman2 UI for each user to unblock them. However, that was too many clicks. Suddenly I remembered Saptak's suggestion about using JavaScript to do this kind of work. Even though I have tried to learn JavaScript 4 times and failed happily, I thought that a bit of searching on DuckDuckGo and some search/replace within example code could help me out.

$checkboxes = document.querySelectorAll("[name$=nomail]");
for (var i=0; i<$checkboxes.length; i++) {
    $checkboxes[i].checked = false;
}

The above small script helped me to uncheck 50 email addresses at a time, and I managed to unblock the email addresses without spending too many hours clicking.

I have also modified the mailing list DMARC settings as suggested. Now, have to wait and see if this happens again.

Categories: FLOSS Project Planets

Podcast.__init__: Algorithmic Trading In Python Using Open Tools And Open Data

Planet Python - Sun, 2019-06-16 22:04
Summary

Algorithmic trading is a field that has grown in recent years due to the availability of cheap computing and platforms that grant access to historical financial data. QuantConnect is a business that has focused on community engagement and open data access to grant opportunities for learning and growth to their users. In this episode CEO Jared Broad and senior engineer Alex Catarino explain how they have built an open source engine for testing and running algorithmic trading strategies in multiple languages, the challenges of collecting and serving current and historical financial data, and how they provide training and opportunity to their community members. If you are curious about the financial industry and want to try it out for yourself then be sure to listen to this episode and experiment with the QuantConnect platform for free.

Announcements
  • Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
  • When you’re ready to launch your next app or want to try a project you hear about on the show, you’ll need somewhere to deploy it, so take a look at our friends over at Linode. With 200 Gbit/s private networking, scalable shared block storage, node balancers, and a 40 Gbit/s public network, all controlled by a brand new API you’ve got everything you need to scale up. And for your tasks that need fast computation, such as training machine learning models, they just launched dedicated CPU instances. Go to pythonpodcast.com/linode to get a $20 credit and launch a new server in under a minute. And don’t forget to thank them for their continued support of this show!
  • And to keep track of how your team is progressing on building new features and squashing bugs, you need a project management system designed by software engineers, for software engineers. Clubhouse lets you craft a workflow that fits your style, including per-team tasks, cross-project epics, a large suite of pre-built integrations, and a simple API for crafting your own. With such an intuitive tool it’s easy to make sure that everyone in the business is on the same page. Podcast.init listeners get 2 months free on any plan by going to pythonpodcast.com/clubhouse today and signing up for a trial.
  • You listen to this show to learn and stay up to date with the ways that Python is being used, including the latest in machine learning and data analysis. For even more opportunities to meet, listen, and learn from your peers you don’t want to miss out on this year’s conference season. We have partnered with organizations such as O’Reilly Media, Dataversity, and the Open Data Science Conference. Coming up this fall is the combined events of Graphorum and the Data Architecture Summit. The agendas have been announced and super early bird registration for up to $300 off is available until July 26th, with early bird pricing for up to $200 off through August 30th. Use the code BNLLC to get an additional 10% off any pass when you register. Go to pythonpodcast.com/conferences to learn more and take advantage of our partner discounts when you register.
  • The Python Software Foundation is the lifeblood of the community, supporting all of us who want to run workshops and conferences, run development sprints or meetups, and ensuring that PyCon is a success every year. They have extended the deadline for their 2019 fundraiser until June 30th and they need help to make sure they reach their goal. Go to pythonpodcast.com/psf today to make a donation. If you’re listening to this after June 30th of 2019 then consider making a donation anyway!
  • Visit the site to subscribe to the show, sign up for the newsletter, and read the show notes. And if you have any questions, comments, or suggestions I would love to hear them. You can reach me on Twitter at @Podcast__init__ or email hosts@podcastinit.com
  • To help other people find the show please leave a review on iTunes and tell your friends and co-workers
  • Join the community in the new Zulip chat workspace at pythonpodcast.com/chat
  • Your host as usual is Tobias Macey and today I’m interviewing Jared Broad and Alex Catarino about QuantConnect, a platform for building and testing algorithmic trading strategies on open data and cloud resources
Interview
  • Introductions
  • How did you get introduced to Python?
  • Can you start by explaining what QuantConnect is and how the business got started?
  • What is your mission for the company?
  • I know that there are a few other entrants in this market. Can you briefly outline how you compare to the other platforms and maybe characterize the state of the industry?
  • What are the main ways that you and your customers use Python?
  • For someone who is new to the space can you talk through what is involved in writing and testing a trading algorithm?
  • Can you talk through how QuantConnect itself is architected and some of the products and components that comprise your overall platform?
  • I noticed that your trading engine is open source. What was your motivation for making that freely available and how has it influenced your design and development of the project?
  • I know that the core product is built in C# and offers a bridge to Python. Can you talk through how that is implemented?
    • How do you address latency and performance when bridging those two runtimes given the time sensitivity of the problem domain?
  • What are the benefits of using Python for algorithmic trading and what are its shortcomings?
    • How useful and practical are machine learning techniques in this domain?
  • Can you also talk through what Alpha Streams is, including what makes it unique and how it benefits the users of your platform?
  • I appreciate the work that you are doing to foster a community around your platform. What are your strategies for building and supporting that interaction and how does it play into your product design?
  • What are the categories of users who tend to join and engage with your community?
  • What are some of the most interesting, innovative, or unexpected tactics that you have seen your users employ?
  • For someone who is interested in getting started on QuantConnect what is the onboarding process like?
    • What are some resources that you would recommend for someone who is interested in digging deeper into this domain?
  • What are the trends in quantitative finance and algorithmic trading that you find most exciting and most concerning?
  • What do you have planned for the future of QuantConnect?
Keep In Touch
Picks
Links

The intro and outro music is from Requiem for a Fish The Freak Fandango Orchestra / CC BY-SA

Categories: FLOSS Project Planets

Manuel A. Fernandez Montecelo: Debian GNU/Linux riscv64 port in mid 2019

Planet Debian - Sun, 2019-06-16 22:00

It's been a while since the last post (Talk about the Debian GNU/Linux riscv64 port at RISC-V workshop), and sometimes things look very quiet from the outside even if the people backstage never stop working. So this is an update on the status of this port before the release of buster, which should happen in a few weeks and which will open the way for more changes that will benefit the port.

The Big Picture

First, the big picture(s):

Debian-Ports All-time Graph, 2019-06-17

As can be seen in the first graph, perhaps with some difficulty, the percentage of arch-dependent packages built for riscv64 (grey line) has been at or above 80% since mid 2018, just a few months after the port was added to the infrastructure.

Given that the arch-dependent packages are about half of Debian's [main, unstable] archive and that (in simple terms) arch-independent packages can be used by all ports (provided that the software they rely on is present, e.g. a programming language interpreter), this means that around 90% of the packages of the whole archive have been available for this architecture from early on.

Debian-Ports Quarter Graph, 2019-06-17

The second graph shows that the percentages are quite stable (for all architectures, really: the peaks and dips in the graph only represent <5% of the total). This is in part due to the freeze for buster, but it usually happens at other times as well (except in the initial bring-up or in the face of severe problems), and it really shows that even the second-class ports are in quite good health in broad terms.


Note: These graphs are for architectures in the debian-ports infrastructure (which hosts architectures not as well supported as the main ones, the ones present in stable releases). The graphs are taken from the buildd stats page, which also includes the main supported architectures.

A little big Thank You

Together, both graphs are also a testament that there are people working on the ports at all times, keeping things working behind the scenes, and that's why from a high-level view it seems that things “just work”.

More in general, aside from the work of porters themselves, there are also people working on bootstrapping issues that make bringing up ports easier than in the past, or coping better when toolchain support or other issues deal an important blow to some ports. And, of course, all other contributors of Debian help by keeping good tools and building rules that work across architectures, patching the upstream software for needs of several architectures at the same time (endianness, width of basic types), many upstream projects are generic enough that they don't need specific porting, etc.

Thanks to all of you!

Next Steps

Installation on hardware, VMs, etc.

Due to several reasons, among them the limited availability of hardware able to run this Debian port and the limited options for bootloaders during all this time, the instructions to get Debian running on RISC-V are not the best, easiest or most elegant, nor very up to date. This is an area to improve in the coming months.

Meanwhile, there's a Debian RISC-V wiki page with instructions to get a chroot working on a HiFive Unleashed board as shipped, without destroying the initial factory set-up.

Vagrant Cascadian and Karsten Merker especially have been working on the area of booting the system, and there are instructions to set up a riscv64 QEMU VM and boot it with u-boot and OpenSBI. Karsten is also working on getting support into debian-installer, the main/canonical way to install Debian systems (perhaps less canonical nowadays with the use of OS images, but still hugely important).

Additionally, it would be nice to have images publicly available and ready to use, both for QEMU and for available hardware like the HiFive Unleashed (or others that might show up in time), but although there's been some progress on that, they are still not ready and available for end users.

The last 10%+ of the archive

So, what's the remaining work left to have almost all of the archive built for this architecture? What's left to port, as such?

The main blockers to get closer to 100% of packages built are basically LLVM and Rust (which, in turn, depends on LLVM).

Currently there are more than 500 packages from the Rust ecosystem in the archive (which is about 4%), and they cannot be built and used until Rust has support for the architecture. And Rust needs LLVM; there's no Rust compiler based on GCC or other toolchains (as is the case with Go, for example, where there's a gcc-go compiler in addition to the reference golang-go), so this is the only alternative.

Firefox is the main high-level package that depends on Rust, but many packages also depend on librsvg2 to render SVG images, and this library has been converted to Rust. We're still using the C version for that, but it cannot be sustained in the long term.

Aside from Rust, other packages directly depend on or use LLVM to some extent, and this is not fully working for riscv64 at the moment, but it is expected that the support of LLVM for riscv64 will be completed during 2019.

There are other programming language ecosystems that need attention, but they represent a really low percentage (only dozens of packages out of more than 12 thousand, and with no dependencies outside that set). And then, of course, there is a long tail of packages that cannot be built due to a missing dependency, lack of support for the architecture or random failures -- together they make up a substantial part of the total, but they need to be looked at and solved almost on a case-by-case basis.

Finally, when the gates of the unstable suite open again after the freeze for the stable release of buster, we will see tools with better support and patches can be accepted again to support riscv64, so we can hope that things will improve at a faster rate soon :-)

Categories: FLOSS Project Planets

Wingware Blog: Extending Wing with Python (Part Two)

Planet Python - Sun, 2019-06-16 21:00

In this issue of Wing Tips we continue to look at how to extend Wing's functionality, by setting up a project that can be used to develop and debug extension scripts written for (and with) Wing, just as you would work with any other Python code.

Creating a Scripting Project

To debug extension scripts written for Wing, you will need to set up a new project that is configured so that Wing can debug itself. The manual steps for doing this are documented in Debugging Extension Scripts. However, let's use an extension script to do this automatically.

Copy the following Python code and paste it into a new file in Wing, using the File > New menu item:

import sys
import os
import collections
import wingapi

def new_scripting_project():
    """Create a new Wing project that is set up for developing and debugging scripts using the current instance of Wing"""

    app = wingapi.gApplication

    def setup_project():
        # Set the Python Executable to the Python that runs Wing
        proj = app.GetProject()
        proj.SetPythonExecutable(None, sys.executable)
        if not __debug__:
            from proj.attribs import kPyRunArgs
            app.fSingletons.fFileAttribMgr.SetValue(kPyRunArgs, None, ('custom', '-u -O'))

        # Add Wing's installation directory to the project
        proj.AddDirectory(app.GetWingHome())

        # Set main debug file to Wing's entry point
        wingpy = os.path.join(app.GetWingHome(), 'bootstrap', 'wing.py')
        proj.SetMainDebugFile(wingpy)

        # Setup the environment for auto-completion on the API and some additional
        # values needed only on macOS; use OrderedDict because later envs depend
        # on earlier ones
        env = collections.OrderedDict()
        env['PYTHONPATH'] = os.path.join(app.GetWingHome(), 'src')
        if sys.platform == 'darwin':
            runtime_dir = os.path.join(app.GetWingHome(), 'bin', '__os__', 'osx')
            files = os.listdir(runtime_dir)
            for fn in files:
                if fn.startswith('runtime-qt'):
                    qt_ver = fn[len('runtime-'):]
                    break
            env.update(collections.OrderedDict([
                ('RUNTIMES', runtime_dir),
                ('QTVERSION', qt_ver),
                ('QTRUNTIME', '${RUNTIMES}/runtime-${QTVERSION}'),
                ('SCIRUNTIME', '${RUNTIMES}/runtime-scintillaedit-${QTVERSION}'),
                ('DYLD_LIBRARY_PATH', '${QTRUNTIME}/lib:${SCIRUNTIME}/lib'),
                ('DYLD_FRAMEWORK_PATH', '${DYLD_LIBRARY_PATH}'),
            ]))
        app.GetProject().SetEnvironment(None, 'startup', env)

    # Create a new project; this prompts the user to save any unsaved files first
    # and only calls the completion callback if they don't cancel
    app.NewProject(setup_project)

new_scripting_project.contexts = [wingapi.kContextNewMenu('Scripts')]

Save this to a file named utils.py or any other name ending in .py in the scripts directory that is located inside Wing's Settings Directory, as listed in Wing's About box. Then select Reload All Scripts from the Edit menu to get Wing to discover the new script file. Once that's done, Wing monitors the file and will reload it automatically if you edit it and save changes to disk.

You can now create your project with New scripting project from the Scripting menu that should appear in the menu bar:

This prompts to save any edited files before closing the current project, and then creates and configures the new project.

Debugging Extension Scripts

Now you're ready to develop and debug extension scripts with Wing. Try this now by appending the following to the script file that you created in the previous section:

import wingapi
def hello_world():
    wingapi.gApplication.ShowMessageDialog("Hello", "Hello world!")
hello_world.contexts = [wingapi.kContextNewMenu("Scripts")]

Save this to disk and then set a breakpoint on the third line, on the call to ShowMessageDialog.

Next, uncheck the Projects > Auto-reopen Last Project preference so that the debugged copy of Wing opens the default project and not your already-open scripting project:

Then select Start/Continue from the Debug menu to start up a copy of Wing running under its own debugger. This copy of Wing should also contain a Scripts menu in its menu bar, with an item Hello world:

Select that and you will reach the breakpoint you set in the outer instance of Wing (the one you started debug from):

You will see and can navigate the entire stack, but Wing will not be able to show source files for most of its internals. If you need to see the source code of Wing itself, you will have to obtain the source code as described in Advanced Scripting.



That's it for now! In the next Wing Tip we'll look more closely at the scripting API.

Categories: FLOSS Project Planets

KBibTeX 0.9 released

Planet KDE - Sun, 2019-06-16 16:02

Finally, KBibTeX 0.9 got released. Virtually nothing has changed since the release of beta 2 in May, as no specific bugs have been reported. Thus the ChangeLog is still the same, and the details of the changes since 0.8.2 are as shown in the release announcement for 0.9-beta2.

Grab the sources at KDE's download mirrors.



Categories: FLOSS Project Planets

DataSmith: Layman's Guide to Image Lazy Loading in Lightning

Planet Drupal - Sun, 2019-06-16 11:08
Barrett, Sun, 06/16/2019 - 10:08
Categories: FLOSS Project Planets

International number formats

Planet KDE - Sun, 2019-06-16 11:01

KMyMoney as a financial application deals with numbers a lot. As a KDE application, it has supported internationalization (or i18n for short) from the very beginning. For accuracy reasons it has internal checks to verify the numbers a user can enter.

The validation routine has a long history (I think it goes back to the KDE3 days) and we recently streamlined it a bit as part of the journey to use more and more Qt standard widgets instead of our own.

This led to the replacement of the KMyMoneyEdit widget with the newer AmountEdit widget. Everything worked great for me (using a German locale) until we received notifications that users could only enter integer numbers but no fractional part. This of course is not what we wanted. But why is that?

The important piece of information was that the user reporting the issue uses the Finland svenska (sv_FI) locale on his system. So I set my development system to use that locale for numbers and currencies and it failed for me as well. So it was pretty clear that the validation logic had a flaw.

Checking the AmountValidator object, which is an extension of QDoubleValidator, I found out that it did not work as expected with the said locale. So it was time to set up some test cases for the validator to see how it performs with other locales. I still saw it failing, which made me curious, so I dug into the Qt source code one more time, specifically into QDoubleValidator. Well, it looks like most of the logic we added in former times has meanwhile become superfluous with the Qt5 version. But there remains a little difference: QDoubleValidator works on the symbols of the LC_NUMERIC category of a locale, whereas we want to use the LC_MONETARY version. So what to do? Simply ignore the fact? This could bite us later.

So the next step was to check what happens within the test cases: using the current code base they fail. Ok, good news: I caught the problem. Using the plain QDoubleValidator and running the test cases again showed that all were positive. So no problem? What if a locale uses different symbols for the decimal point and the thousands separator in its numeric and monetary categories? Does such a locale exist? Looking at all of them seemed a bit tedious, but tedious work is something for a computer. Here's the little script I wrote (for simplicity in Perl):

checklocales (download)
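
For those who prefer not to run Perl, the same four symbols can also be queried from a small C++ program via std::setlocale() and std::localeconv(); this is just an illustration of what the script checks, not the script itself:

#include <clocale>
#include <cstdio>

int main(int argc, char *argv[])
{
    // pass a locale name that is installed on the system, e.g. ./checklocale sv_FI.utf8
    const char *name = (argc > 1) ? argv[1] : "sv_FI.utf8";
    if (!std::setlocale(LC_ALL, name)) {
        std::printf("locale %s not available\n", name);
        return 1;
    }
    const std::lconv *lc = std::localeconv();
    std::printf("dp '%s'  mon_dp '%s'  sep '%s'  mon_sep '%s'\n",
                lc->decimal_point, lc->mon_decimal_point,
                lc->thousands_sep, lc->mon_thousands_sep);
    return 0;
}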

So, they do exist. Bummer, we need to do something about it. On my box, I collected 484 locales, of which 110 show differences. That's close to 25% of the locales, too much to simply ignore the problem. Since some of the locales were present in multiple formats (e.g. br_FR, br_FR@euro and br_FR.utf8) I concentrated on the utf8 versions and the ones that contain an at-sign in their name. This reduced the figures to 37 locales with differing values out of 182 locales checked. Still 20%. Here's that list. <> denotes an empty string, and _ denotes a blank in the locale data; if I did not replace those values, one could not see a difference in a terminal session.

Locale          dp  mon_dp  sep  mon_sep
aa_DJ.utf8      .   .       <>   _
aa_ER@saaho     .   .       <>   ,
be_BY@latin     ,   .       .    _
be_BY.utf8      ,   .       .    _
bg_BG.utf8      ,   ,       <>
bs_BA.utf8      ,   ,       <>   _
ca_AD.utf8      ,   ,       <>   .
ca_ES@euro      ,   ,       <>   .
ca_ES.utf8      ,   ,       <>   .
ca_FR.utf8      ,   ,       <>   .
ca_IT.utf8      ,   ,       <>   .
de_AT@euro      ,   ,       .    _
de_AT.utf8      ,   ,       .    _
es_MX.utf8      .   .            ,
es_PE.utf8      ,   .       .    ,
eu_ES@euro      ,   ,       <>   .
eu_ES.utf8      ,   ,       <>   .
gez_ER@abegede  .   .       <>   ,
gl_ES@euro      ,   ,       <>   .
gl_ES.utf8      ,   ,       <>   .
hr_HR.utf8      ,   ,       <>   _
it_CH.utf8      ,   .       '    '
it_IT@euro      ,   ,       <>   .
it_IT.utf8      ,   ,       <>   .
mg_MG.utf8      ,   ,       <>   _
nl_BE.utf8      ,   ,       .    _
nl_NL@euro      ,   ,       <>   _
nl_NL.utf8      ,   ,       <>   _
pl_PL.utf8      ,   ,       <>   .
pt_PT@euro      ,   ,       <>   .
pt_PT.utf8      ,   ,       <>   .
ru_RU.utf8      ,   .
ru_UA.utf8      ,   .       .    _
so_DJ.utf8      .   .       <>   _
sr_RS@latin     ,   ,       <>   .
tg_TJ.utf8      ,   .       .    _
tt_RU@iqtelif   ,   .       .

182 locales processed, 37 locales differ.

Next I went in and tried to figure out what differences exist. A bit of UNIX foo (i.e. combining the tools of your toolbox) shows the results:

./checklocales | tr '\t' ':' | cut -d: -f2- | tr ':' '\t' | sort | uniq | grep -v "mon_dp" | grep -v "process"

reveals the following result (I added the header here manually):

dp  mon_dp  sep  mon_sep
,   ,       <>   _
,   ,       <>   .
,   ,       .    _
,   .       .    _
,   .       .    ,
,   .       .
,   .       '    '
.   .       <>   _
.   .       <>   ,
.   .            ,
,   ,       <>
,   .

Running this additionally through “wc -l” shows 12 different combinations. The last one is strange, because it does not show anything in the last two columns. Looking at the full table shows that this is the entry produced by ru_RU.utf8. Why is that? Running

LC_ALL=ru_RU.utf8 locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep

we get

decimal_point=","
mon_decimal_point="."
thousands_sep=" "
mon_thousands_sep=" "

which looks just fine, but why did we not see the underbar for the blanks in the thousands separator? Running the above output through our friend od we see why:

LC_ALL=ru_RU.utf8 locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep | od -c
0000000   d   e   c   i   m   a   l   _   p   o   i   n   t   =   "   ,
0000020   "  \n   m   o   n   _   d   e   c   i   m   a   l   _   p   o
0000040   i   n   t   =   "   .   "  \n   t   h   o   u   s   a   n   d
0000060   s   _   s   e   p   =   " 302 240   "  \n   m   o   n   _   t
0000100   h   o   u   s   a   n   d   s   _   s   e   p   =   " 302 240
0000120   "  \n
0000122

Ah, “302 240” octal: that is some UTF-8 encoded character presented as a blank in my default locale. Ok, since it is the same character for both the numeric and the monetary separator, no big deal. But there are two more:

,   .   .
.   .        ,

Going into the long list we see that these are the results of tt_RU@iqtelif and es_MX.utf8. Looking at od’s output we see:

LC_ALL=tt_RU@iqtelif locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep | od -c
0000000   d   e   c   i   m   a   l   _   p   o   i   n   t   =   "   ,
0000020   "  \n   m   o   n   _   d   e   c   i   m   a   l   _   p   o
0000040   i   n   t   =   "   .   "  \n   t   h   o   u   s   a   n   d
0000060   s   _   s   e   p   =   "   .   "  \n   m   o   n   _   t   h
0000100   o   u   s   a   n   d   s   _   s   e   p   =   " 342 200 202
0000120   "  \n
0000122

LC_ALL=es_MX.utf8 locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep | od -c
0000000   d   e   c   i   m   a   l   _   p   o   i   n   t   =   "   .
0000020   "  \n   m   o   n   _   d   e   c   i   m   a   l   _   p   o
0000040   i   n   t   =   "   .   "  \n   t   h   o   u   s   a   n   d
0000060   s   _   s   e   p   =   " 342 200 211   "  \n   m   o   n   _
0000100   t   h   o   u   s   a   n   d   s   _   s   e   p   =   "   ,
0000120   "  \n
0000122

so again just some special chars which are shown as blank, but they differ between numeric and currency representation in the said locales.

While trying a few things I stumbled across the following output. See for yourself:

LC_ALL=C.utf8 locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep
decimal_point="."
mon_decimal_point="."
thousands_sep=""
mon_thousands_sep=""

LC_ALL=C locale -k decimal_point mon_decimal_point thousands_sep mon_thousands_sep
decimal_point="."
mon_decimal_point=""
thousands_sep=""
mon_thousands_sep=""

What? The C locale (the non-UTF-8 version) has no monetary decimal point defined? This may lead to strange results, but since nowadays we talk about UTF-8 anyway, I simply put it aside.

Back to the original problem: the AmountValidator. Looking at the results above, it does not seem to make sense to simply use the plain QDoubleValidator version. Would it make sense to allow both versions? To use the QDoubleValidator we would need to replace the currency versions of the two characters with their numeric versions. The question is: would that lead to false strings in any of the locales? Looking at the long list, we find the following locales where both characters differ:

Locale          dp  mon_dp  sep  mon_sep
be_BY@latin     ,   .       .    _
be_BY.utf8      ,   .       .    _
es_PE.utf8      ,   .       .    ,
ru_UA.utf8      ,   .       .    _
tg_TJ.utf8      ,   .       .    _
tt_RU@iqtelif   ,   .       .

es_PE.utf8 seems to be the worst candidate. The meaning is simply reversed. Let’s add this locale to the testcases and see how it performs.

Looks like it works out of the box. Strange, I expected a different result. Anyway, the changes to KMyMoney are now committed to the master branch and I can continue with the next problem.
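
Just to make the idea above concrete: mapping the monetary symbols to their numeric counterparts before handing the string to QDoubleValidator could look roughly like this. This is only a sketch of the approach discussed, with made-up helper functions for the monetary symbols, and not the literal change that was committed:

// sketch only: normalise monetary symbols to the numeric ones and delegate
QValidator::State AmountValidator::validate(QString &input, int &pos) const
{
    const QLocale loc = locale();
    QString copy = input;
    // monetaryDecimalPoint()/monetaryGroupSeparator() are hypothetical helpers
    // returning the LC_MONETARY symbols for this locale
    copy.replace(monetaryDecimalPoint(loc), loc.decimalPoint());
    copy.replace(monetaryGroupSeparator(loc), loc.groupSeparator());
    return QDoubleValidator::validate(copy, pos);
}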

Categories: FLOSS Project Planets

Catalin George Festila: Python 3.7.3 : Using the pycryptodome python module.

Planet Python - Sun, 2019-06-16 01:54
This python module can be used with python 3. More information can be found here. PyCryptodome is a self-contained Python package of low-level cryptographic primitives. It supports Python 2.6 and 2.7, Python 3.4 and newer, and PyPy. The install of this python module is easy with pip tool: C:\Python373\Scripts>pip install pycryptodome Collecting pycryptodome ... Installing collected packages:
Categories: FLOSS Project Planets

Russ Allbery: Review: Abaddon's Gate

Planet Debian - Sun, 2019-06-16 01:17

Review: Abaddon's Gate, by James S.A. Corey

Series: The Expanse #3 Publisher: Orbit Copyright: 2013 ISBN: 0-316-23542-3 Format: Kindle Pages: 540

Abaddon's Gate is the third book in the Expanse series, following Caliban's War. This series tells a single long story, so it's hard to discuss without spoilers for earlier books although I'll try. It's a bad series to read out of order.

Once again, solar system politics are riled by an alien artifact set up at the end of the previous book. Once again, we see the fallout through the eyes of multiple viewpoint characters. And, once again, one of them is James Holden, who starts the book trying to get out of the blast radius of the plot but is pulled back into the center of events. But more on that in a moment.

The other three viewpoint characters are, unfortunately, not as strong as the rest of the cast in Caliban's War. Bull is the competent hard-ass whose good advice is repeatedly ignored. Anna is a more interesting character, a Methodist reverend who reluctantly leaves her wife and small child to join an interfaith delegation (part of a larger delegation of artists and philosophers, done mostly as a political stunt) to the alien artifact at the center of this book. Anna doesn't change that much over the course of the book, but her determined, thoughtful kindness and intentional hopefulness was appealing to read about. She also has surprisingly excellent taste in rich socialite friends.

The most interesting character in the book is the woman originally introduced as Melba. Her obsessive quest for revenge drives much of the plot, mostly via her doing awful things but for reasons that come from such a profound internal brokenness, and with so much resulting guilt, that it's hard not to eventually feel some sympathy. She's also the subject of the most effective and well-written scene in the book: a quiet moment of her alone in a weightless cell, trying to position herself in its exact center. (Why this is so effective is a significant spoiler, but it works incredibly well in context.)

Melba's goal in life is to destroy James Holden and everything he holds dear. This is for entirely the wrong reasons, but I had a hard time not feeling a little bit sympathetic to that too.

I had two major problems with Abaddon's Gate. The first of them is that this book (and, I'm increasingly starting to feel, this series) is about humans doing stupid, greedy, and self-serving things in the face of alien mystery, with predictably dire consequences. This is, to be clear, not in the slightest bit unrealistic. Messy humans being messy in the face of scientific wonder (and terror), making tons of mistakes, but then somehow muddling through is very in character for our species. But realistic doesn't necessarily mean entertaining.

A lot of people die or get seriously injured in this book, and most of that is the unpredictable but unsurprising results of humans being petty assholes in the face of unknown dangers instead of taking their time and being thoughtful and careful. The somewhat grim reputation of this series comes from being relatively unflinching about showing the results of that stupidity. Bad decisions plus forces that do not care in the slightest about human life equals mass casualties. The problem, at least for me personally, is this is not fun to read about. If I wanted to see more of incompetent people deciding not to listen to advice or take the time to understand a problem, making impetuous decisions that make them feel good, and then turning everything to shit, I could just read the news. Bull as a viewpoint character doesn't help, since he's smart enough to see the consequences coming but can't stop them. Anna is the one character who manages to reverse some of the consequences by being a better person than everyone else, and that partly salvages the story, but there wasn't enough of that.

The other problem is James Holden. I was already starting to get annoyed with his self-centered whininess in Caliban's War, but in Abaddon's Gate it turns into eye-roll-inducing egomania. Holden seems convinced that everything that happens is somehow about him personally, and my tolerance for self-centered narcissists is, shall we say, at a historically low ebb. There's a point late in this book when Holden decides to be a sexist ass to Naomi (I will never understand what that woman sees in him), and I realized I was just done. Done with people pointing out to Holden that he's just a wee bit self-centered, done with him going "huh, yeah, I guess I am" and then making zero effort to change his behavior, done with him being the center of the world-building for no good reason, done with plot armor and the clear favor of the authors protecting him from consequences and surrounding him with loyalty he totally doesn't deserve, done with his supposed charisma which is all tell and no show. Just done. At this point, I actively loathe the man.

The world-building here is legitimately interesting, if a bit cliched. I do want to know where the authors are going with their progression of alien artifacts, what else humanity might make contact with, and what the rest of the universe looks like. I also would love to read more about Avasarala, who sadly didn't appear in this book but is the best character in this series so far. I liked Anna, I ended up surprising myself and liking Melba (or at least the character she becomes), and I like most of Holden's crew. But I may be done with the series here because I'm not sure I can take any more of Holden. I haven't felt this intense of dislike for a main series character since I finally gave up on The Wheel of Time.

Abaddon's Gate has a lot of combat, a lot of dead people, and a lot of gruesome injury, all of which is belabored enough that it feels a bit padded, but it does deliver on what it promises: old-school interplanetary spaceship fiction with political factions, alien artifacts, some mildly interesting world-building, and, in Melba, some worthwhile questions about what happens after you've done something unforgivable. It doesn't have Avasarala, and therefore is inherently far inferior to Caliban's War, but if you liked the previous books in the series, it's more of that sort of thing. If Holden has been bothering you, though, that gets much worse.

Followed by Cibola Burn.

Rating: 6 out of 10

Categories: FLOSS Project Planets

KDE Usability & Productivity: Week 75

Planet KDE - Sun, 2019-06-16 01:01

Week 75 in KDE’s Usability & Productivity initiative is here! It’s a little lighter than usual because we’re all feverishly preparing for the dual Plasma and Usability & Productivity sprints next week in Valencia, Spain. I’ll be there, as well as the whole Plasma team, a bunch of VDG people, and a number of folks who have stepped up to work on apps for the U&P initiative. Sprints like these promise the kind of face-to-face contact needed for big projects, and should be a fantastically awesome and productive time! I’d like to offer a special thanks to Slimbook (makers of the KDE Slimbook II laptop) for hosting the sprint!

Anyway, here’s the week’s report:

New Features
Bugfixes & Performance Improvements
User Interface Improvements

Next week, your name could be in this list! Not sure how? Just ask! I’ve helped mentor a number of new contributors recently and I’d love to help you, too! You can also check out https://community.kde.org/Get_Involved, and find out how you can help be a part of something that really matters. You don’t have to already be a programmer. I wasn’t when I got started. Try it, you’ll like it! We don’t bite!

If you find KDE software useful, consider making a donation to the KDE e.V. foundation.

Categories: FLOSS Project Planets

The state of Terminal Emulators in Linux

Planet KDE - Sat, 2019-06-15 20:00

We are seeing a strange trend nowadays: the terminal is being more widely used on Windows, now that they have fixed the broken cmd.exe and Microsoft is actively promoting terminal-centric applications like Visual Studio Code and the new Windows Terminal. The Unix world, on the other hand, has always been terminal based: even if you tend to use the desktop environment for your daily needs, you probably also have a terminal open somewhere.

And because we have always been terminal users, there are a lot of terminals for Linux. But it’s not easy to keep a terminal up to date: the code is usually fragile and hard to maintain, which means that a lot of projects, good projects, end up abandoned.

Good terminals that unfortunately didn’t make it
  • Tilix: One of the best terminal emulators out there, a shell written in D on top of VTE; the author lacks the time to continue working on it.
  • Terminator: I don’t know whether this one is abandoned, as there’s no news on the web about it; the last news I know of is that version 2 was scheduled for release in 2017, which never happened.

If you are sad that your favorite terminal is without a maintainer, try to maintain it yourself; free software depends on developers.

The state of Konsole:
  • Konsole has received more reviews, more code and more contributors in the last year than it got in the previous 10 years.
  • In the past, the project was kept alive by Kurt, and he did an awesome job. But a project as big as Konsole needs more people.
  • Three new developers have joined Konsole over the past year and are actively fixing things, with no plans to stop.
  • Terminator / Tilix split mode is working
  • A few features needed to be completely on par with Tilix / Terminator are still missing, but they are not hard to implement.

Now it has more developers and more code flowing: fixing bugs, improving the interface, increasing the amount of code moving through the codebase. We don’t plan to stop supporting Konsole, and it will not depend on a single developer anymore.

We want Konsole to be the Swiss army knife of terminal emulators: you can already do a lot of things with Konsole that are impossible in other terminals, but we want more. And we need more developers for that.

Konsole is, together with VTE, the most widely used terminal technology out there in terms of the number of applications that integrate it: Dolphin, Kate, KDevelop, Yakuake, and many other applications embed Konsole, so by fixing a bug in one place we are helping a lot of other applications too.

Consider joining a project; consider sending code.

Categories: FLOSS Project Planets

Erich Schubert: Chinese Citation Factory

Planet Debian - Sat, 2019-06-15 18:02

RetractionWatch published in February 2018 an article titled “A journal waited 13 months to reject a submission. Days later, it published a plagiarized version by different authors”, indicating that the editorial process of the journal Multimedia Tools and Applications (MTAP) may have been manipulated.

Now, more than a year later, Springer apparently has retracted additional articles from the journal, as mentioned in the blog For Better Science. On the downside, Elsevier has been publishing many of these in another journal now instead…

I am currently aware of 22 retractions associated with this incident. One would have expected to see a clear pattern in the author names, but they seem to have little in common except Chinese names and affiliations, and suspicious email addresses (also, usually only one author has an email at all). It almost appears as if the names may be made up. And these retracted papers clearly contained citation spam: they cite a particular author very often, usually in a single paragraph.

The retraction notices typically include the explanation “there is evidence suggesting authorship manipulation and an attempt to subvert the peer review process”, confirming the earlier claims by Retraction Watch.

So I used the CrossRef API to get the citations from all the articles (I tried SemanticScholar first, but for some of the retracted papers it only had the self-cite of the retraction notice), and counted the citations in these papers.

Essentially, I am counting how many citations authors lost by the retractions.

Here is the “high score” with the top 5 citation losers:

Author         Citations lost   Cited in papers   Citation share   Retractions
L. Zhang       385              20                60.6%            1
M. Song        68               20                10.9%            0
C. Chen        65               19                11.1%            0
X. Liu         65               19                11.0%            0
R. Zimmermann  60               18                10.8%            0

Now this is a surprisingly clear pattern. In 20 of the retracted papers, L. Zhang was cited on average 19.25 times. In these papers, 60% of the references were also co-authored by him. In one of the remaining two papers, he was an author himself. The next authors seem to be in this list mostly because of having co-authored with L. Zhang earlier. In fact, if we ignore all citations to papers co-authored by L. Zhang, no author receives more than 5 citations anymore.

So this very clearly suggests that L. Zhang manipulated the MTAP journal to boost his citation index. And it is quite disappointing how long it took until Springer retracted those articles! Judging by the For Better Science article, there may be even more affected papers.

Categories: FLOSS Project Planets

Codementor: HackerRank: Circular Array Rotation in Python

Planet Python - Sat, 2019-06-15 16:19
How I solved my daily recommended HackerRank challenge problem and a small rant on problem specificity
Categories: FLOSS Project Planets

Doug Hellmann: sphinxcontrib.sqltable 2.0.0

Planet Python - Sat, 2019-06-15 15:15
sphinxcontrib-sqltable is a Sphinx extension for embedding database contents in documents.

What’s new in 2.0.0?
  • drop python 2 support
  • fix documentation build
  • fix flake8 errors
  • add tox and travis config
Categories: FLOSS Project Planets
