FLOSS Project Planets

Kdenlive – refactoring preview and news

Planet KDE - Tue, 2017-06-20 03:06


We are very happy to announce the first AppImage of the next generation Kdenlive. We have been working since the first days of 2017 to clean up and improve the architecture of Kdenlive’s code to make it more robust and maintainable. This work also marked the move to QML for the display of the timeline.

This first AppImage is only provided for testing purposes. It crashes a lot because many features have yet to be ported to the new code, but you can already get a glimpse of the new timeline, move clips and compositions, group items and add some effects. The AppImage can be downloaded from the KDE servers: just download it, make the file executable and run it. This version is not appropriate for production use and, due to file format changes, will not properly open previous Kdenlive project files. We hope to provide reliable nightly build AppImages so that our users can follow the development and provide feedback before the final release.

Today is also our 18th Kdenlive Café, so you can meet us tonight – the 20th of June – at 9pm (CEST) in the #kdenlive channel to discuss the evolution and issues around Kdenlive.

I will also be presenting the progress of this Kdenlive version this summer (on the 22nd of July) at Akademy in Almería, Spain, so feel free to come and visit the KDE community at this great event.

Categories: FLOSS Project Planets

Shawn McKinney: 2017 Dirty Kanza Finish Line

Planet Apache - Tue, 2017-06-20 01:14

Note: This post is about my second Dirty Kanza 200 experience on June 3, 2017.

It’s broken into seven parts:

Part I – Prep / Training

Part II – Preamble

Part III – Starting Line

Part IV – Checkpoint One

Part V – Checkpoint Two

Part VI – Checkpoint Three

Part VII – Finish Line

Regroup

I went looking for Derrick but couldn’t find him.  A woman, who I later found out was his wife, approached me…

“Are you John?” she asked.

I replied with my name and didn’t make the connection.  I’d forgotten the color of his support team and he got my name wrong so that made us even.

He caught up ten miles later, by then chasing the fast chicks.  I called out as they zoomed past, wished them well.  This is how it works.  Alliances change according to the conditions and needs from one moment to the next.

A lone rider stopped at the edge of downtown — Rick from Dewitt, Arkansas.  He was ready for takeoff.

“You headed out, how bout we team up?”  I asked matter-of-factly.  The deal was struck and then there were two.

Eventually, maybe twenty miles later, we picked up Jeremy, which made three.  It worked pretty well.  Not much small talk, but lots of operational chatter.  You’d have thought we were out on military maneuvers.

  • “Rocks on left.”
  • “Mud — go right!”
  • “Off course, turning around.”
  • “Rough! Slowing!”

There were specializations.  For example, Jeremy was the scout.  His bike had fat tires and so he’d bomb the downhills, call back to us what he saw, letting us know of the dangers.  Rick did most of the navigating.  I kept watch on time, distance and set the pace.

By this time we were all suffering and made brief stops every ten miles or so.  We’d agreed that it was OK, had plenty of time, and weren’t worried.

Caught up with Derrick six miles from home.  Apparently he couldn’t keep up with the fast chicks either, but gave it the college try, and we had a merry reunion.

We rolled over the finish line somewhat past 2:00 am.

Rick and I crossing the FL

Here’s the official video feed:

https://results.chronotrack.com/athlete/index/e/29334039

And the unofficial one:

My support team was there along with a smattering of hearty locals to cheer us and offer congratulations.

Jeremy, Rick and I had a brief moment where we congratulated each other before LeLan handed over our Breakfast Club finishers patches and I overheard Rick in his southern drawl…

“I don’t care if it does say breakfast club on there.”

Next were the hugs and pictures with my pit crew and I was nearly overcome with emotion.  Felt pretty good about the finish and I don’t care if it says breakfast club on there either.

The Pit Crew, l to r, Me, Gregg, Kelly, Janice, Cheri, Kyle

Acknowledgements

In addition to my pit crew…

My wife Cindy deserves most of the credit.  She bought the bike four years ago that got me all fired up about cycling again.  Lots of times when I’m out there riding I should be home working.  Through it all she continues to support me without complaint.  Thanks baby, you’re the best, I love you.

Next are the guys at the bike shop — Arkansas Cycle and Fitness, my support team back home in Little Rock.  They tolerate my abysmal mechanical abilities, patiently listen to my requirements, and teach when need be (often).  Time and again they made the adjustments necessary to correct the issues I was having with the bike.  They’ve encouraged and cheered, offered suggestions on routes, tactics, training, nutrition, hydration and everything else related to the sport of endurance cycling.

Finally, my cycling buddies — the Crackheads.  Truth be known they’re probably more trail runners than cyclists, but they’re incredible athletes, from whom I’ve learned much about training for these types of endurance events.  In the summertime, when the skeeters and chiggers get too bad for Arkansas trail running, they come out and ride which makes me happy.

The End


Categories: FLOSS Project Planets

Bryan Pendleton: Ghost Ship report released

Planet Apache - Tue, 2017-06-20 01:09

The Oakland Fire Department has released their official report on last December's Ghost Ship Fire: Origin and Cause Report: Incident # 2016-085231.

The report is long, detailed, thorough, and terribly, terribly sad.

It is vividly illustrated with many pictures, which are simultaneously fascinating and heart-breaking.

In the end, the report accepts the limits of what is known:

No witnesses to the incipient stage of the fire were located. Based on witness statements and analysis of fire patterns, an area of origin was identified in the northwest area of the ground floor of the warehouse. In support of this hypothesis, fire patterns and fire behavior were considered, including ventilation effects of door openings, and the fuel load, consisting of large amounts of non-traditional building materials. This analysis was challenged with alternate hypotheses of fire origins, away from, or communicating to an area remote from, the immediate area of fire origin. Several potential ignition sources were considered. No conclusive determination of the initial heat source or the first materials ignited was made. The fire classification is UNDETERMINED.

In their Pulitzer Prize-winning coverage of the fire, At least nine dead, many missing in Oakland warehouse fire, the East Bay Times highlighted the eccentricities of the collective's building, details which are thoroughly corroborated by the OFD's detailed report.

Alemany had advertised on Facebook and Craigslist looking for renters seeking "immediate change and loving revolution," who enjoyed "poetics, dramatics, film, tantric kitten juggling and nude traffic directing." He described it as 10,000 square feet of vintage redwood and antique steel "styled beyond compare."

His 1951 purple Plymouth remained parked Saturday in front of the building that burned so hot, the “Ghost Ship” letters painted across the front had all but melted away.

"They are ex-Burning Man people and had their kids in the place -- three kids running around with no shoes," said DeL Lee, 34, who lived there for three months two years ago. "It was nuts."

He described the place as a filthy firetrap, with frequent power outages, overloaded outlets, sparks and the smell of burning wire. A camping stove with butane tanks served as the kitchen, and a hole had been chiseled through the concrete wall to access the bathroom at the adjoining automotive repair shop next door.

The staircase, which had two switchbacks to get to the second floor, was built of pallets, plywood and footholds -- like a ship’s gangplank -- and was like "climbing a fort" to get up and down, say people who had visited the building.

Pianos and old couches doubled as room dividers. Pallets covered with shingles and elaborate trim formed sculptural walls. Often, Lee said, the place was filled with the sounds of sawing and hammering as Alemany continued to build.

And the OFD report confirms that chaos:

The front staircase was located along the east wall at the front of the structure. It was constructed of various wooden planks and wooden studs, as well as portions of wooden pallets at its top where it accessed the second floor. One of two bathrooms was elevated slightly off ground level where the lower staircase landing was located. The orientation of the staircase was such that it first led eastward where it bordered the east wall, then turned north (rearward), where it then turned slightly west at the very top of the staircase at the second floor.

As the Mercury News observes, releasing the report is a milestone but it's not the last we'll hear of this; the next steps will involve the courts:

Almena and the collective’s creative director, Max Harris, are charged with 36 counts of involuntary manslaughter in Alameda County Superior Court. Prosecutors charge the men knowingly created a fire trap and invited the public inside. Lawyers for the two men say their clients are being used as scapegoats and say the building owner Chor Ng should be facing criminal charges.

Ng, Almena, PG&E, are named in a wrongful death lawsuit that is in its early stages. The City of Oakland is expected to be named in the suit as soon as this week.

“Of course, we’d like to have them say what the cause of the fire is but we also have our experts and they are not done with their analysis. I don’t think it’s the end of the story,” said Mary Alexander, the lead attorney for the victim’s families.

Meanwhile, as Rick Paulas writes at The Awl, the media attention has already brought many changes to the "artist collective" aspect of the Bay Area and beyond: How The Media Mishandled The Ghost Ship Fire

“It was a tragedy, then it became a tragedy on an entirely different spectrum,” said Friday. “The original was the loss of such vibrant, wonderful people. And that was forgotten and it just became about housing ordinances, and this witch hunt of warehouses.”

The hunt continues, not just in Oakland, but around the country. City governments — rather than focusing efforts on bringing warehouses up to code, of empathizing with tenants forced to live in unsafe conditions due to increasing rents and lower wages, hell, even pragmatically understanding the added value these fringe residents add to a city’s cultural cachet — are simply closing down venues, then moving on to close down the next. Two days after Ghost Ship, the tenants of Baltimore’s iconic Bell Foundry were evicted. A week after that, the punk venue Burnt Ramen in Richmond, CA was shut down. On April 27th, eight people living in an artist collective in San Francisco were evicted.

The cities say they don’t want another Ghost Ship, implying they mean another massive loss of life. But the speed at which they’re performing these evictions — and the lack of solutions they’re offering to those displaced — suggests that what they really don’t want is another Ghost Ship media event: vultures descending, camera lights illuminating the dark corners of their own institutional failures.

It's important that we not forget the Ghost Ship tragedy, but it's a hard story to (re-)read and (re-)consider, with plenty of sadness to go around.

Categories: FLOSS Project Planets

Norbert Preining: TeX Live 2017 hits Debian/unstable

Planet Debian - Mon, 2017-06-19 21:09

Yesterday I uploaded the first packages of TeX Live 2017 to Debian/unstable, meaning that the new release cycle has started. Debian/stretch was released over the weekend, and this opened up unstable for new developments. The upload comprised the following packages: asymptote, cm-super, context, context-modules, texlive-base, texlive-bin, texlive-extra, texlive-lang, texworks, xindy.

I mentioned already in a previous post the following changes:

  • several packages have been merged, some have been dropped (e.g. texlive-htmlxml) and one new package (texlive-plain-generic) has been added
  • luatex got updated to 1.0.4, and is now considered stable
  • updmap and fmtutil now require either -sys or -user
  • tlmgr got a shell mode (interactive/scripting interface) and a new feature to add arbitrary TEXMF trees (conf auxtrees)

The last two changes are described together with other news (easy TEXMF tree management) in the TeX Live release post. These changes more or less sum up the new infrastructure developments in TeX Live 2017.

Since the last release to unstable (which happened on 2017-01-23) about half a year of package updates have accumulated; below is an approximate list of updates (not split into new and updated packages, though).

Enjoy the brave new world of TeX Live 2017, and please report bugs to the BTS!

Updated/new packages:
academicons, achemso, acmart, acro, actuarialangle, actuarialsymbol, adobemapping, alkalami, amiri, animate, aomart, apa6, apxproof, arabluatex, archaeologie, arsclassica, autoaligne, autobreak, autosp, axodraw2, babel, babel-azerbaijani, babel-english, babel-french, babel-indonesian, babel-japanese, babel-malay, babel-ukrainian, bangorexam, baskervaldx, baskervillef, bchart, beamer, beamerswitch, bgteubner, biblatex-abnt, biblatex-anonymous, biblatex-archaeology, biblatex-arthistory-bonn, biblatex-bookinother, biblatex-caspervector, biblatex-cheatsheet, biblatex-chem, biblatex-chicago, biblatex-claves, biblatex-enc, biblatex-fiwi, biblatex-gb7714-2015, biblatex-gost, biblatex-ieee, biblatex-iso690, biblatex-manuscripts-philology, biblatex-morenames, biblatex-nature, biblatex-opcit-booktitle, biblatex-oxref, biblatex-philosophy, biblatex-publist, biblatex-shortfields, biblatex-subseries, bibtexperllibs, bidi, biochemistry-colors, bookcover, boondox, bredzenie, breqn, bxbase, bxcalc, bxdvidriver, bxjalipsum, bxjaprnind, bxjscls, bxnewfont, bxorigcapt, bxpapersize, bxpdfver, cabin, callouts, chemfig, chemformula, chemmacros, chemschemex, childdoc, circuitikz, cje, cjhebrew, cjk-gs-integrate, cmpj, cochineal, combofont, context, conv-xkv, correctmathalign, covington, cquthesis, crimson, crossrefware, csbulletin, csplain, csquotes, css-colors, cstldoc, ctex, currency, cweb, datetime2-french, datetime2-german, datetime2-romanian, datetime2-ukrainian, dehyph-exptl, disser, docsurvey, dox, draftfigure, drawmatrix, dtk, dviinfox, easyformat, ebproof, elements, endheads, enotez, eqnalign, erewhon, eulerpx, expex, exsheets, factura, facture, fancyhdr, fbb, fei, fetamont, fibeamer, fithesis, fixme, fmtcount, fnspe, fontmfizz, fontools, fonts-churchslavonic, fontspec, footnotehyper, forest, gandhi, genealogytree, glossaries, glossaries-extra, gofonts, gotoh, graphics, graphics-def, graphics-pln, grayhints, gregoriotex, gtrlib-largetrees, gzt, halloweenmath, handout, hang, heuristica, hlist, hobby, hvfloat, hyperref, hyperxmp, ifptex, ijsra, japanese-otf-uptex, jlreq, jmlr, jsclasses, jslectureplanner, karnaugh-map, keyfloat, knowledge, komacv, koma-script, kotex-oblivoir, l3, l3build, ladder, langsci, latex, latex2e, latex2man, latex3, latexbug, latexindent, latexmk, latex-mr, leaflet, leipzig, libertine, libertinegc, libertinus, libertinust1math, lion-msc, lni, longdivision, lshort-chinese, ltb2bib, lualatex-math, lualibs, luamesh, luamplib, luaotfload, luapackageloader, luatexja, luatexko, lwarp, make4ht, marginnote, markdown, mathalfa, mathpunctspace, mathtools, mcexam, mcf2graph, media9, minidocument, modular, montserrat, morewrites, mpostinl, mptrees, mucproc, musixtex, mwcls, mweights, nameauth, newpx, newtx, newtxtt, nfssext-cfr, nlctdoc, novel, numspell, nwejm, oberdiek, ocgx2, oplotsymbl, optidef, oscola, overlays, pagecolor, pdflatexpicscale, pdfpages, pdfx, perfectcut, pgfplots, phonenumbers, phonrule, pkuthss, platex, platex-tools, polski, preview, program, proofread, prooftrees, pst-3dplot, pst-barcode, pst-eucl, pst-func, pst-ode, pst-pdf, pst-plot, pstricks, pstricks-add, pst-solides3d, pst-spinner, pst-tools, pst-tree, pst-vehicle, ptex2pdf, ptex-base, ptex-fontmaps, pxbase, pxchfon, pxrubrica, pythonhighlight, quran, ran_toks, reledmac, repere, resphilosophica, revquantum, rputover, rubik, rutitlepage, sansmathfonts, scratch, seealso, sesstime, siunitx, skdoc, songs, spectralsequences, stackengine, stage, sttools, studenthandouts, svg, tcolorbox, tex4ebook, tex4ht, texosquery, 
texproposal, thaienum, thalie, thesis-ekf, thuthesis, tikz-kalender, tikzmark, tikz-optics, tikz-palattice, tikzpeople, tikzsymbols, titlepic, tl17, tqft, tracklang, tudscr, tugboat-plain, turabian-formatting, txuprcal, typoaid, udesoftec, uhhassignment, ukrainian, ulthese, unamthesis, unfonts-core, unfonts-extra, unicode-math, uplatex, upmethodology, uptex-base, urcls, variablelm, varsfromjobname, visualtikz, xassoccnt, xcharter, xcntperchap, xecjk, xepersian, xetexko, xevlna, xgreek, xsavebox, xsim, ycbook.

Categories: FLOSS Project Planets

Curtis Miller: Walk-Forward Analysis Demonstration with backtrader

Planet Python - Mon, 2017-06-19 20:00
Here I demonstrate walk-forward analysis with backtrader and investigate overfitting with backtesting and optimizing strategies for stock data.
Categories: FLOSS Project Planets

Daniel Bader: Enriching Your Python Classes With Dunder (Magic, Special) Methods

Planet Python - Mon, 2017-06-19 20:00
Enriching Your Python Classes With Dunder (Magic, Special) Methods

What Python’s “magic methods” are and how you would use them to make a simple account class more Pythonic.

What Are Dunder Methods?

In Python, special methods are a set of predefined methods you can use to enrich your classes. They are easy to recognize because they start and end with double underscores, for example __init__ or __str__.

As it quickly became tiresome to say under-under-method-under-under, Pythonistas adopted the term “dunder methods”, short for “double under.”

These “dunders” or “special methods” in Python are also sometimes called “magic methods.” But using this terminology can make them seem more complicated than they really are—at the end of the day there’s nothing “magical” about them. You should treat these methods like a normal language feature.

Dunder methods let you emulate the behavior of built-in types. For example, to get the length of a string you can call len('string'). But an empty class definition doesn’t support this behavior out of the box:

    class NoLenSupport:
        pass

    >>> obj = NoLenSupport()
    >>> len(obj)
    TypeError: "object of type 'NoLenSupport' has no len()"

To fix this, you can add a __len__ dunder method to your class:

    class LenSupport:
        def __len__(self):
            return 42

    >>> obj = LenSupport()
    >>> len(obj)
    42

Another example is slicing. You can implement a __getitem__ method which allows you to use Python’s list slicing syntax: obj[start:stop].
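
As a small illustration (my own sketch, not part of the article's Account example; the Portfolio class and its holdings are hypothetical), delegating __getitem__ to an underlying list is enough to get both indexing and slicing:

    class Portfolio:
        """Hypothetical wrapper around a list of holdings."""
        def __init__(self, holdings):
            self._holdings = list(holdings)

        def __getitem__(self, index):
            # The underlying list already understands both int and
            # slice objects, so obj[0] and obj[start:stop] both work.
            return self._holdings[index]

    >>> p = Portfolio(['AAPL', 'MSFT', 'GOOG'])
    >>> p[0]
    'AAPL'
    >>> p[0:2]
    ['AAPL', 'MSFT']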

Special Methods and the Python Data Model

This elegant design is known as the Python data model and lets developers tap into rich language features like sequences, iteration, operator overloading, attribute access, etc.

You can see Python’s data model as a powerful API you can interface with by implementing one or more dunder methods. If you want to write more Pythonic code, knowing how and when to use dunder methods is an important step.

For a beginner this might be slightly overwhelming at first, though. No worries: in this article I will guide you through the use of dunder methods using a simple Account class as an example.

Enriching a Simple Account Class

Throughout this article I will enrich a simple Python class with various dunder methods to unlock the following language features:

  • Initialization of new objects
  • Object representation
  • Enable iteration
  • Operator overloading (comparison)
  • Operator overloading (addition)
  • Method invocation
  • Context manager support (with statement)

You can find the final code example here. I’ve also put together a Jupyter notebook so you can more easily play with the examples.

Object Initialization: __init__

Right at the start of my class I already need a special method. To construct account objects from the Account class I need a constructor, which in Python is the __init__ dunder:

    class Account:
        """A simple account class"""

        def __init__(self, owner, amount=0):
            """
            This is the constructor that lets us create
            objects from this class
            """
            self.owner = owner
            self.amount = amount
            self._transactions = []

The constructor takes care of setting up the object. In this case it receives the owner name, an optional start amount and defines an internal transactions list to keep track of deposits and withdrawals.

This allows us to create new accounts like this:

    >>> acc = Account('bob')  # default amount = 0
    >>> acc = Account('bob', 10)

Object Representation: __str__, __repr__

It’s common practice in Python to provide a string representation of your object for the consumer of your class (a bit like API documentation). There are two ways to do this using dunder methods:

  1. __repr__: The “official” string representation of an object. This is how you would make an object of the class. The goal of __repr__ is to be unambiguous.

  2. __str__: The “informal” or nicely printable string representation of an object. This is for the end user.

Let’s implement these two methods on the Account class:

    class Account:
        # ... (see above)

        def __repr__(self):
            return 'Account({!r}, {!r})'.format(self.owner, self.amount)

        def __str__(self):
            return 'Account of {} with starting amount: {}'.format(
                self.owner, self.amount)

If you don’t want to hardcode "Account" as the name for the class you can also use self.__class__.__name__ to access it programmatically.
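
For example, a sketch of a __repr__ that does this (a variation of mine, not the article's final code):

    def __repr__(self):
        # self.__class__.__name__ resolves to the actual class name,
        # so subclasses of Account get a sensible repr for free.
        return '{}({!r}, {!r})'.format(
            self.__class__.__name__, self.owner, self.amount)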

If you want to implement just one of these to-string methods on a Python class, make it __repr__: Python falls back to __repr__ when __str__ is not defined, but not the other way around.

Now I can query the object in various ways and always get a nice string representation:

    >>> str(acc)
    'Account of bob with starting amount: 10'

    >>> print(acc)
    "Account of bob with starting amount: 10"

    >>> repr(acc)
    "Account('bob', 10)"

Iteration: __len__, __getitem__, __reversed__

In order to iterate over our account object I need to add some transactions. So first, I’ll define a simple method to add transactions. I’ll keep it simple because this is just setup code to explain dunder methods, and not a production-ready accounting system:

    def add_transaction(self, amount):
        if not isinstance(amount, int):
            raise ValueError('please use int for amount')
        self._transactions.append(amount)

I also defined a property to calculate the balance on the account so I can conveniently access it with account.balance. This method takes the start amount and adds a sum of all the transactions:

    @property
    def balance(self):
        return self.amount + sum(self._transactions)

Let’s do some deposits and withdrawals on the account:

    >>> acc = Account('bob', 10)
    >>> acc.add_transaction(20)
    >>> acc.add_transaction(-10)
    >>> acc.add_transaction(50)
    >>> acc.add_transaction(-20)
    >>> acc.add_transaction(30)
    >>> acc.balance
    80

Now I have some data and I want to know:

  1. How many transactions were there?

  2. Index the account object to get transaction number …

  3. Loop over the transactions

With the class definition I have so far, this is currently not possible. All of the following statements raise TypeError exceptions:

    >>> len(acc)
    TypeError

    >>> for t in acc:
    ...     print(t)
    TypeError

    >>> acc[1]
    TypeError

Dunder methods to the rescue! It only takes a little bit of code to make the class iterable:

    class Account:
        # ... (see above)

        def __len__(self):
            return len(self._transactions)

        def __getitem__(self, position):
            return self._transactions[position]

Now the previous statements work:

    >>> len(acc)
    5

    >>> for t in acc:
    ...     print(t)
    20
    -10
    50
    -20
    30

    >>> acc[1]
    -10

To iterate over transactions in reversed order you can implement the __reversed__ special method:

    def __reversed__(self):
        return sorted(self, reverse=True)

    >>> list(reversed(acc))
    [50, 30, 20, -10, -20]

By the way, I wrapped this in a list() call because reversed() returns a generator object in Python 3.

All in all, this looks quite Pythonic to me now.

Operator Overloading for Comparing Accounts: __eq__, __lt__

We all write dozens of statements daily to compare Python objects:

    >>> 2 > 1
    True

    >>> 'a' > 'b'
    False

This feels completely natural, but it’s actually quite amazing what happens behind the scenes here. Why does > work equally well on integers, strings and other objects (as long as they are the same type)? This polymorphic behavior is possible because these objects implement one or more comparison dunder methods.

An easy way to verify this is to use the dir() builtin:

    >>> dir('a')
    ['__add__',
    ...
    '__eq__',           <---------------
    '__format__',
    '__ge__',           <---------------
    '__getattribute__',
    '__getitem__',
    '__getnewargs__',
    '__gt__',           <---------------
    ...]

Let’s build a second account object and compare it to the first one (I am adding a couple of transactions for later use):

    >>> acc2 = Account('tim', 100)
    >>> acc2.add_transaction(20)
    >>> acc2.add_transaction(40)
    >>> acc2.balance
    160

    >>> acc2 > acc
    TypeError: "'>' not supported between instances of 'Account' and 'Account'"

What happened here? We got a TypeError because I have not implemented any comparison dunders nor inherited them from a parent class.

Let’s add them. To avoid having to implement all of the comparison dunder methods, I use the functools.total_ordering decorator, which allows me to take a shortcut and only implement __eq__ and __lt__:

    from functools import total_ordering

    @total_ordering
    class Account:
        # ... (see above)

        def __eq__(self, other):
            return self.balance == other.balance

        def __lt__(self, other):
            return self.balance < other.balance

And now I can compare Account instances no problem:

    >>> acc2 > acc
    True

    >>> acc2 < acc
    False

    >>> acc == acc2
    False

Operator Overloading for Merging Accounts: __add__

In Python, everything is an object. We are completely fine adding two integers or two strings with the + (plus) operator; it behaves in expected ways:

    >>> 1 + 2
    3

    >>> 'hello' + ' world'
    'hello world'

Again, we see polymorphism at play: did you notice how + behaves differently depending on the type of the object? For integers it sums, for strings it concatenates. Again, doing a quick dir() on the object reveals the corresponding “dunder” interface into the data model:

    >>> dir(1)
    [...
    '__add__',
    ...
    '__radd__',
    ...]

Our Account object does not support addition yet, so when you try to add two instances of it there’s a TypeError:

    >>> acc + acc2
    TypeError: "unsupported operand type(s) for +: 'Account' and 'Account'"

Let’s implement __add__ to be able to merge two accounts. The expected behavior would be to merge all attributes together: the owner name, as well as starting amounts and transactions. To do this we can benefit from the iteration support we implemented earlier:

def __add__(self, other): owner = '{}&{}'.format(self.owner, other.owner) start_amount = self.amount + other.amount acc = Account(owner, start_amount) for t in list(self) + list(other): acc.add_transaction(t) return acc

Yes, it is a bit more involved than the other dunder implementations so far. It should show you though that you are in the driver’s seat. You can implement addition however you please. If we wanted to ignore historic transactions—fine, you can also implement it like this:

    def __add__(self, other):
        owner = self.owner + other.owner
        start_amount = self.balance + other.balance
        return Account(owner, start_amount)

I think the former implementation would be more realistic though, in terms of what a consumer of this class would expect to happen.

Now we have a new merged account with starting amount $110 (10 + 100) and balance of $240 (80 + 160):

    >>> acc3 = acc2 + acc
    >>> acc3
    Account('tim&bob', 110)

    >>> acc3.amount
    110

    >>> acc3.balance
    240

    >>> acc3._transactions
    [20, 40, 20, -10, 50, -20, 30]

Note this works in both directions because we’re adding objects of the same type. In general, if you were to add your object to a builtin (int, str, …), the __add__ method of the builtin wouldn’t know anything about your object. In that case you need to implement the reverse add method (__radd__) as well. You can see an example for that here.
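
The linked example is not reproduced here, but as a rough sketch of my own (assuming we also want sum() over a list of accounts to work, which the article does not require), __radd__ could look like this:

    def __radd__(self, other):
        # Called for `other + self` when other's own __add__ gives up.
        # Treating the integer 0 specially lets sum([acc, acc2]) work,
        # because sum() starts from 0.
        if other == 0:
            return self
        return self.__add__(other)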

Callable Python Objects: __call__

You can make an object callable like a regular function by adding the __call__ dunder method. For our account class we could print a nice report of all the transactions that make up its balance:

    class Account:
        # ... (see above)

        def __call__(self):
            print('Start amount: {}'.format(self.amount))
            print('Transactions: ')
            for transaction in self:
                print(transaction)
            print('\nBalance: {}'.format(self.balance))

Now when I call the object with the double-parentheses acc() syntax, I get a nice account statement with an overview of all transactions and the current balance:

    >>> acc = Account('bob', 10)
    >>> acc.add_transaction(20)
    >>> acc.add_transaction(-10)
    >>> acc.add_transaction(50)
    >>> acc.add_transaction(-20)
    >>> acc.add_transaction(30)

    >>> acc()
    Start amount: 10
    Transactions:
    20
    -10
    50
    -20
    30

    Balance: 80

Please keep in mind that this is just a toy example. A “real” account class probably wouldn’t print to the console when you use the function call syntax on one of its instances. In general, the downside of having a __call__ method on your objects is that it can be hard to see what the purpose of calling the object is.

Most of the time it’s therefore better to add an explicit method to the class. In this case it probably would’ve been more transparent to have a separate Account.print_statement() method.
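
For illustration, such a method could simply reuse the body of __call__ above under an explicit name (my sketch, not part of the original article):

    def print_statement(self):
        # Same report as __call__, but the intent is obvious at the
        # call site: acc.print_statement() instead of acc().
        print('Start amount: {}'.format(self.amount))
        print('Transactions: ')
        for transaction in self:
            print(transaction)
        print('\nBalance: {}'.format(self.balance))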

Context Manager Support and the With Statement: __enter__, __exit__

My final example in this tutorial is about a slightly more advanced concept in Python: Context managers and adding support for the with statement.

Now, what is a “context manager” in Python? Here’s a quick overview:

A context manager is a simple “protocol” (or interface) that your object needs to follow so it can be used with the with statement. Basically all you need to do is add __enter__ and __exit__ methods to an object if you want it to function as a context manager.

Let’s use context manager support to add a rollback mechanism to our Account class. If the balance goes negative upon adding another transaction, we roll back to the previous state.

We can leverage the Pythonic with statement by adding two more dunder methods. I’m also adding some print calls to make the example clearer when we demo it:

    class Account:
        # ... (see above)

        def __enter__(self):
            print('ENTER WITH: Making backup of transactions for rollback')
            self._copy_transactions = list(self._transactions)
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            print('EXIT WITH:', end=' ')
            if exc_type:
                self._transactions = self._copy_transactions
                print('Rolling back to previous transactions')
                print('Transaction resulted in {} ({})'.format(
                    exc_type.__name__, exc_val))
            else:
                print('Transaction OK')

As an exception has to be raised to trigger a rollback, I define a quick helper method to validate the transactions in an account:

    def validate_transaction(acc, amount_to_add):
        with acc as a:
            print('Adding {} to account'.format(amount_to_add))
            a.add_transaction(amount_to_add)
            print('New balance would be: {}'.format(a.balance))
            if a.balance < 0:
                raise ValueError('sorry cannot go in debt!')

Now I can use an Account object with the with statement. When I make a transaction to add a positive amount, all is good:

    acc4 = Account('sue', 10)

    print('\nBalance start: {}'.format(acc4.balance))
    validate_transaction(acc4, 20)
    print('\nBalance end: {}'.format(acc4.balance))

Executing the above Python snippet produces the following printout:

    Balance start: 10

    ENTER WITH: Making backup of transactions for rollback
    Adding 20 to account
    New balance would be: 30
    EXIT WITH: Transaction OK

    Balance end: 30

However when I try to withdraw too much money, the code in __exit__ kicks in and rolls back the transaction:

    acc4 = Account('sue', 10)

    print('\nBalance start: {}'.format(acc4.balance))

    try:
        validate_transaction(acc4, -50)
    except ValueError as exc:
        print(exc)

    print('\nBalance end: {}'.format(acc4.balance))

In this case we get a different result:

    Balance start: 10

    ENTER WITH: Making backup of transactions for rollback
    Adding -50 to account
    New balance would be: -40
    EXIT WITH: Rolling back to previous transactions
    ValueError: sorry cannot go in debt!

    Balance end: 10

Conclusion

I hope you feel a little less afraid of dunder methods after reading this article. A strategic use of them makes your classes more Pythonic, because they emulate the behavior of built-in types.

As with any feature, please don’t overuse it. Operator overloading, for example, can get pretty obscure. Adding “karma” to a person object with +bob or tim << 3 is definitely possible using dunders, but might not be the most obvious or appropriate way to use these special methods. However, for common operations like comparison and addition they can be an elegant approach.

Showing each and every dunder method would make for a very long tutorial. If you want to learn more about dunder methods and the Python data model I recommend you go through the Python reference documentation.

Also, be sure to check out our dunder method coding challenge where you can experiment and put your newfound “dunder skills” to practice.

Categories: FLOSS Project Planets

Arcanist Set Up

Planet KDE - Mon, 2017-06-19 20:00
Setting up Arcanist for Koko
  • It was pretty easy to install. On my Arch Linux system, the command below did the work for me.

    yaourt -S arcanist-git

  • Then I had to add .arcconfig to the Koko repository so that arc knows where it should publish the changes:

    { "phabricator.uri": "https://phabricator.kde.org/" }

  • The only problem is with the SSL certificates, as the university campus wireless network uses its own self-signed certificate. This creates problems accessing SSL-encrypted web content, which is pretty much everything related to development :P
  • Also, the university campus network does not allow SSH over its network. This prevents me from pushing changes to the git repository.
  • Hence, to use Arcanist, every time I will have to check the curl.cainfo setting in /etc/php/php.ini and set/unset the GIT_SSL_CAINFO environment variable depending on the network I am using.
Categories: FLOSS Project Planets

Jeremy Bicha: GNOME Tweak Tool 3.25.3

Planet Debian - Mon, 2017-06-19 19:15

Today I released the second development snapshot (3.25.3) of what will be GNOME Tweak Tool 3.26.

I consider the initial User Interface (UI) rework proposed by the GNOME Design Team to be complete now. Every page in Tweak Tool has been updated, either in this snapshot or the previous development snapshot.

The hard part still remains: making the UI look as good as the mockups. Tweak Tool’s backend makes this a bit more complicated than usual for an app like this.

Here are a few visual highlights of this release.

The Typing page has been moved into an Additional Layout Options dialog in the Keyboard & Mouse page. Also, the Compose Key option has been given its own dialog box.

Florian Müllner added content to the Extensions page that is shown if you don’t have any GNOME Shell extensions installed yet.

A hidden feature that GNOME has had for a long time is the ability to move the Application Menu from the GNOME top bar to a button in the app’s title bar. This is easy to enable in Tweak Tool by turning off the Application Menu switch in the Top Bar page. This release improves how well that works, especially for Ubuntu users where the required hidden appmenu window button was probably not pre-configured.

Some of the ComboBoxes have been replaced by ListBoxes. One example is on the Workspaces page where the new design allows for more information about the different options. The ListBoxes are also a lot easier to select than the smaller ComboBoxes were.

For details of these and other changes, see the commit log or the NEWS file.

GNOME Tweak Tool 3.26 will be released alongside GNOME 3.26 in mid-September.

Categories: FLOSS Project Planets

Commerce Guys: Drupal Commerce integration with Square payments released

Planet Drupal - Mon, 2017-06-19 15:41

Square's pitch is pretty straightforward: accept payments anywhere, no coding required. They nailed this first through their simple phone based card readers and their slick in-store tablet interface. They also made it easy to process all major credit cards, guaranteeing deposits as soon as the next business day.

They're now rolling out their same great support for merchants online with the steady release of open APIs for eCommerce applications. With our recently released Commerce Square module for Drupal 7 and Drupal 8, you can now pitch Drupal Commerce to existing Square customers in your area.

Thus far we integrate their payment APIs for full checkout and administrative support in Commerce 1.x and 2.x. We're working with Square to integrate new APIs as they become available in pursuit of our vision to enable the turnkey creation of online stores for Square merchants using Drupal Commerce.

Among the module's primary benefits, especially when used in conjunction with Square for retail sales, are:

  • You can sell online, and in person, with all of your sales in one place. Integrating your physical and online retail operations will ultimately simplify the management of your business.
  • Accept all major cards (including Apple Pay and Android Pay in-store) and pay one simple rate per tap, dip, or swipe. (Note: Square also offers custom rates for qualifying stores processing over $250k annually.)
  • Integration is simple and seamless for Drupal Commerce on both Drupal 7 and Drupal 8. (Look for new features to come into the D8 branch first and be backported, as additional contributed modules may be required to make up for core features that were added in Commerce 2.x.)

Square’s eCommerce API is on the bleeding edge of integration technology, making PCI compliance easy; customer credit card information never touches your website, so you don’t need to worry or complete a single checklist. Each component of the payment method form is an embedded iframe hosted on Square's server and returns a payment token identifier used to capture payment. We actually designed the Commerce 2.x checkout form to treat these types of payment methods as first class citizens, so expect the integration to just work.

Sign up for Square and get free processing on your first $5,000 in sales.

As part of our module launch effort, Square has teamed up with us to offer free processing on your first $5,000 in sales. If you or one of your customers are interested in learning more, review the offer details here and hit us up in the issue queue if you run into any problems!

Categories: FLOSS Project Planets

Pimping KRuler

Planet KDE - Mon, 2017-06-19 14:27

KRuler, in case you don't know it, is a simple software ruler to measure lengths on your desktop. It is one of the oldest KDE tools, its first commit dating from November 4th, 2000. Yes, it's almost old enough to vote.

I am a long time KRuler user. It gets the job done, but I have often found myself saying "one day I'll fix this or that". And never doing it.

HiDPI screens really hurt the poor app, so I finally decided to do something about it and spend some time on it during my daily commute.

This is what it looked like on my screen when I started working on it:

As any developer would, I expected it to be no more than a week of work... Of course it took way longer than that, because there was always something odd here and there preventing me from pushing a patch.

I started by making KRuler draw scale numbers less often to avoid ugly overlapping texts. I then made it draw ticks on both sides, to go from 4 orientations (North, South, West, East) to 2: vertical or horizontal.

The optional rotation and buttons were getting in the way though: the symmetric ticks required the scale numbers to be vertically centered, so the buttons were overlapping them. I decided to remove them (they were already off by default). With only two orientations it is less useful to have rotation buttons anyway: it is simple enough to use either the context menu, middle-click the ruler, or the R shortcut to change the orientation. Closing is accessible through the context menu as well.

One of the reasons (I think) for the 4 orientations was the color picker feature. It makes little sense to me to have a color picker in a ruler: it is more natural to use KColorChooser to pick colors. I removed the color picker, allowing me to remove the oddly shaped mouse cursor and refresh the appearance of the length indicator to something a bit nicer.

I then made it easier to adjust the length of the ruler by dragging its edges instead of having to pick the appropriate length from a sub-menu of the context menu. This made it possible to remove this sub-menu.

This is what KRuler looks like now:

That is only part 1 though. I originally had 2 smaller patches to add, but Jonathan Riddell, who kindly reviewed the monster patch, requested another small fix, so that makes 3 patches to go. I need to set up and figure out how to use Arcanist to submit them to Phabricator, as I have been told Review Board is old school these days :)

Categories: FLOSS Project Planets

Weekly Python Chat: Python Asterisks

Planet Python - Mon, 2017-06-19 14:00

The * and ** unpacking operators (often seen as *args and **kwargs) got a big upgrade in Python 3. We're going to chat about how * and ** work, what they're good for, and what to watch out for in different versions of Python.
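
As a small taste of the topic, here is a quick sketch (my own example, not from the chat) of the Python 3 behavior of both operators, on the defining side and on the calling side:

    def tag(name, *contents, **attrs):
        # *contents collects extra positional arguments into a tuple,
        # **attrs collects extra keyword arguments into a dict.
        attributes = ''.join(' {}="{}"'.format(k, v) for k, v in attrs.items())
        return '<{0}{1}>{2}</{0}>'.format(name, attributes, ''.join(contents))

    >>> tag('p', 'hello', 'world', cls='greeting')
    '<p cls="greeting">helloworld</p>'

    >>> args = ('p', 'hi')           # * and ** also unpack at call sites
    >>> tag(*args, **{'id': 'x'})
    '<p id="x">hi</p>'

    >>> [0, *range(1, 3)]            # Python 3.5+: unpacking inside literals
    [0, 1, 2]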

Categories: FLOSS Project Planets

Acquia Developer Center Blog: Introducing Reservoir, a Distribution for Decoupling Drupal

Planet Drupal - Mon, 2017-06-19 13:51

Reservoir is an experimental Drupal distribution that is an exceptional starting point for any decoupled Drupal implementation. It is also designed to onboard developers of all backgrounds: a decoupled Drupal distribution and an optimal back end for every developer.

Tags: acquia drupal planet
Categories: FLOSS Project Planets

Shirish Agarwal: Seizures, Vigo and bi-pedal motion

Planet Debian - Mon, 2017-06-19 12:49

Dear all, an update is in order. While talking to my physiotherapist a couple of days ago, I came to know the correct term for what I had experienced: a convulsive ‘seizure‘, with spasms being a part of it. Reading the Wikipedia entry and the associated links/entries, it seems I am and was very, very lucky.

The hospital, or any hospital, is a very, very bad place. I have seen all the horror movies which people say are disturbing, but I have never been disturbed as much as I was in hospital. I couldn’t help but hear people’s screams, and I saw so many cases which turned critical. At times it was not easy to remain positive, but from somewhere there was a will to live which pushed me and is still pushing me.

One of the things that was painful for a long time was the almost constant stream of injections given to me. It was almost an afterthought that the nurse put a Vigo in me.

While the above medical device is similar, mine had a cross, and the needle was much shorter and is injected into the vein. After that, all injections go into it, including the common liquid of salt and water that is given to patients to stabilize them first. I don’t remember its name at the moment.

I also had a urine bag which was attached to my penis in a non-invasive manner. Both my grandfather and grandma used to cry when things went wrong with theirs, while I didn’t feel any pain even when the urine bag was detached and attached again, so it seems things have improved there.

I was also very conscious of getting bed sores, as both my grandpa and grandma had them when in hospital. As I had no strength, I had to beg, plead, do everything to make sure that every few hours I was turned from one side to the other. I also had an air bag which is supposed to alleviate or relieve this condition.

Constant physiotherapy every day for a while slowly increased my strength, and eventually both the Vigo and the feeding tube put down my throat were removed.

I have no memory of when they put the feeding tube in; it was all rubber and felt bad when it came out.

Further physiotherapy helped me crawl to the top of the bed; the bed was around 6 feet in length and more than enough so I could turn to both sides without falling over.

A few days later I found I could also sit up using my legs as a lever, and that gave the doctors confidence to remove the air bed so I could crawl more easily.

A couple more days later I stood on my feet for the first time, and it was like I had lead legs. Each step was painful, but the sense and feeling of independence won over whatever pain was there.

I had to endure wet wipes from nurses and ward boys in place of a shower every day, and while they were always respectful, it felt humiliating.

The first time I had a bath, after 2 weeks or something, every part of my body cried and I felt like a weakling. I had thought I wouldn’t be able to do justice to the physiotherapy session which was soon after, but after the session I was back to feeling normal.

For a while I was doing the penguin waddle, which, while painful, also had some humor in it. I did think of shooting a video of the penguin waddle but decided against it, as I was half-naked most of the time (the hospital clothes never fit me properly).

Cut to today: I was able to climb up and down the stairs on my own and circle my own block, slowly, but I was able to do it by myself.

While I have always had a sense of wonderment for bi-pedal motion as well as all other means of transport, I have found much more respect for walking. I live near a fast food joint, so I see lots of youngsters posing in different ways with their legs to show interest to their mates. This, I know, happens on both the conscious and sub-conscious levels. To be able to see and discern that also added to my sense of wonder at nature’s creations.

All in all, I’m probably around 40% independent and still 60% dependent on others. I know I have to be patient with myself and those around me and explain to others what I’m going through.

For example, I still tend to spill things and still can’t touch-type much.

So, the road is long. I can only pray and offer best wishes to anybody who is in my condition, and I do pray that nobody goes through what I went through, especially not children.

I am also hoping that things like DxtER and a range of non-invasive treatments make their way to India and the developing world at large.

For anybody who is overweight and is either disgusted by or doesn’t like the gym route, I would recommend doing sessions with a physiotherapist you can trust. You have to trust that her judgement will push you a bit more, but not so much that the gains you make are toppled over.

I still get dizzy spells while doing therapy, but I have the will to break through them, as I know dizziness doesn’t help me.

I hope my writing gives strength and understanding to somebody who is going through this, or to relatives and/or caregivers, so they know the mental state of the person going through it.

Till later and sorry it became so long.

Update – I forgot to share this inspirational story from my city which I shared with a friend days ago. Add to that, she is from my city. What it doesn’t share is that Triund is a magical place. I had visited once with a friend who had elf ears (he had put on elf ears), and it is the kind of place The Alchemist talks about, a place where imagination turns wild and there is magic in the air.


Filed under: Miscellenous Tagged: #air bag, #bed sores, #convulsive epileptic seizure, #crawling, #horror, #humiliation, #nakedness, #penguin waddle, #physiotherapy, #planet-debian, #spilling things, #urine bag, #Vigo medical device
Categories: FLOSS Project Planets

pythonwise: Who Touched the Code Last? (git)

Planet Python - Mon, 2017-06-19 12:47
Sometimes I'd like to know who to ask about a piece of code. I've developed a little Python script that shows the last people who touched a file/directory and the ones who touched it most.

Example output (on arrow project)
$ owners
Most: Wes McKinney (39.5%), Uwe L. Korn (15.3%), Kouhei Sutou (10.8%)
Last: Kengo Seki (31M), Uwe L. Korn (39M), Max Risuhin (23H)

So ask Wes or Uwe :)
Here's the code:
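
(The script itself is embedded in the original post and does not survive the feed; the following is my own rough reconstruction of the idea. The `owners` command name and the output format come from the post, everything else is an approximation, and the "time since last touch" part of the Last line is omitted.)

    #!/usr/bin/env python3
    """Rough sketch: who touched a path the most, and who touched it last."""
    from collections import Counter
    from subprocess import check_output
    import sys


    def commit_authors(path):
        # One author name per commit that touched `path`, newest first.
        out = check_output(['git', 'log', '--format=%an', '--', path])
        return out.decode('utf-8').splitlines()


    def main():
        path = sys.argv[1] if len(sys.argv) > 1 else '.'
        names = commit_authors(path)
        if not names:
            raise SystemExit('no commits found for {!r}'.format(path))

        counts = Counter(names)
        most = ', '.join('{} ({:.1%})'.format(name, count / len(names))
                         for name, count in counts.most_common(3))

        # dict.fromkeys() keeps insertion order, so this yields the
        # three most recent distinct committers.
        last = ', '.join(list(dict.fromkeys(names))[:3])

        print('Most:', most)
        print('Last:', last)


    if __name__ == '__main__':
        main()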
Categories: FLOSS Project Planets

Colm O hEigeartaigh: Querying Apache HBase using Talend Open Studio for Big Data

Planet Apache - Mon, 2017-06-19 12:23
Recent blog posts have described how to set up authorization for Apache HBase using Apache Ranger. However, those posts only covered inputting and reading data using the HBase Shell. In this post, we will show how Talend Open Studio for Big Data can be used to read data stored in Apache HBase. This post is along the same lines as other recent tutorials on reading data from Kafka and HDFS.

1) HBase setup

Follow this tutorial on setting up Apache HBase in standalone mode, and creating a 'data' table with some sample values using the HBase Shell.

2) Download Talend Open Studio for Big Data and create a job

Now we will download Talend Open Studio for Big Data (6.4.0 was used for the purposes of this tutorial). Unzip the file when it is downloaded and then start the Studio using one of the platform-specific scripts. It will prompt you to download some additional dependencies and to accept the licenses. Click on "Create a new job" and call it "HBaseRead". In the search bar on the right-hand side, enter "hbase" and hit enter. Drag "tHBaseConnection" and "tHBaseInput" onto the palette, as well as "tLogRow".

"tHBaseConnection" is used to set up the connection to "HBase", "tHBaseInput" uses the connection to read data from HBase, and "tLogRow" will log the data that was read so that we can see that the job ran successfully. Right-click on "tHBaseConnection" and select "Trigger/On Subjob Ok" and drag the resulting arrow to the "tHBaseInput" component. Now right click on "tHBaseInput" and select "Row/Main" and drag the arrow to "tLogRow".
3) Configure the components

Now let's configure the individual components. Double click on "tHBaseConnection" and select the distribution "Hortonworks" and Version "HDP V2.5.0" (from an earlier tutorial we are using HBase 1.2.6). We are not using Kerberos here so we can skip the rest of the security configuration. Now double click on "tHBaseInput". Select the "Use an existing connection" checkbox. Now hit "Edit Schema" and add two entries to map the column we created in two different column families: "c1" which matches DB "col1" of type String, and "c2" which matches DB "col1" of type String.


Select "data" for the table name back in tHBaseInput and add a mapping for "c1" to "colfam1", and "c2" to "colfam2".


Now we are ready to run the job. Click on the "Run" tab and then hit the "Run" button. You should see "val1" and "val2" appear in the console window.
Categories: FLOSS Project Planets

Vasudev Kamath: Update: - Shell pipelines with subprocess crate and use of Exec::shell function

Planet Debian - Mon, 2017-06-19 11:18

In my previous post I used the Exec::shell function from the subprocess crate and passed it a string generated by interpolating the --author argument. This string was then run by the shell via Exec::shell. After publishing the post I got a ping on IRC from Jonas Smedegaard and Paul Wise saying that I should replace Exec::shell, as it might be prone to errors or shell injection attacks. Indeed they were right; in my hurry I did not completely read the function documentation, which clearly mentions this fact.

When invoking this function, be careful not to interpolate arguments into the string run by the shell, such as Exec::shell(format!("sort {}", filename)). Such code is prone to errors and, if filename comes from an untrusted source, to shell injection attacks. Instead, use Exec::cmd("sort").arg(filename).

Though I'm not directly taking input from an untrusted source, it's still possible that the string I got back from the git log command might contain some oddly formatted text with characters of a different encoding, which could possibly break Exec::shell, as I'm not sanitizing the shell command. When we use Exec::cmd and pass arguments using .args chaining, the library takes care of creating a safe command line. So I went in and modified the function to use Exec::cmd instead of Exec::shell.

Below is updated function.

    fn copyright_fromgit(repo: &str) -> Result<Vec<String>> {
        let tempdir = TempDir::new_in(".", "debcargo")?;
        Exec::cmd("git")
            .args(&["clone", "--bare", repo, tempdir.path().to_str().unwrap()])
            .stdout(subprocess::NullFile)
            .stderr(subprocess::NullFile)
            .popen()?;

        let author_process = {
            Exec::shell(OsStr::new("git log --format=\"%an <%ae>\"")).cwd(tempdir.path()) |
            Exec::shell(OsStr::new("sort -u"))
        }.capture()?;
        let authors = author_process.stdout_str().trim().to_string();
        let authors: Vec<&str> = authors.split('\n').collect();

        let mut notices: Vec<String> = Vec::new();
        for author in &authors {
            let author_string = format!("--author={}", author);
            let first = {
                Exec::cmd("/usr/bin/git")
                    .args(&["log", "--format=%ad", "--date=format:%Y",
                            "--reverse", &author_string])
                    .cwd(tempdir.path()) | Exec::shell(OsStr::new("head -n1"))
            }.capture()?;

            let latest = {
                Exec::cmd("/usr/bin/git")
                    .args(&["log", "--format=%ad", "--date=format:%Y", &author_string])
                    .cwd(tempdir.path()) | Exec::shell("head -n1")
            }.capture()?;

            let start = i32::from_str(first.stdout_str().trim())?;
            let end = i32::from_str(latest.stdout_str().trim())?;
            let cnotice = match start.cmp(&end) {
                Ordering::Equal => format!("{}, {}", start, author),
                _ => format!("{}-{}, {}", start, end, author),
            };

            notices.push(cnotice);
        }
        Ok(notices)
    }

I still use Exec::shell for generating the author list; this is not problematic, as I'm not interpolating arguments to create the command string.

Categories: FLOSS Project Planets

Adding API dox QCH files generation to KDE Frameworks builds

Planet KDE - Mon, 2017-06-19 10:30

Or: tying up loose ends where some are still slightly too short.

When

  • you favour offline documentation (not only due to nice integration with IDEs like KDevelop),
  • develop code using KDE Frameworks or other Qt-based libraries,
  • you know all the KF5 libraries have seen many people taking care of API documentation in the code over all the years,
  • and you have read about doxygen’s capability to create API dox in QCH format,
  • and you want your Linux distribution package management to automatically deliver the latest version of the documentation (resp. QCH files) together with the KDE Frameworks libraries and headers (and ideally same for other Qt-based libraries),

the idea is easily derived: just extend the libraries’ buildsystem to also spit out QCH files during the package builds.

It’s all prepared, can ship next week, latest!!1

This would just be a simple additional target and command, invoking doxygen with a proper configuration file. Right? So simple, you wonder why no-one had done it yet.

Some initial challenge seems quickly handled, which is even more encouraging:
for proper documentation one also wants cross-linking to the documentation of things used in the API which come from other libraries, e.g. base classes and types. This requires passing to doxygen the list of those other documentations together with a set of parameters, to generate proper qthelp:// URLs or to copy over documentation for things like inherited methods.
Such a listing gets very long, especially for KDE Frameworks libraries in tier 3. And with indirect dependencies pulled into the API, the list might become incomplete when things change. The same goes for any other changes to the parameters for those other documentations.
So it is basically a situation similar to linking code libraries, which suggests giving it a similar handling: placing the needed information in the CMake config files of the targeted library, so whoever cross-links to the QCH file of that library can fetch the up-to-date information from there.

Things seemed to work okay on first tests, so last September a pull request was made to add some respective macro module to Extra-CMake-Modules to get things going and a blog post “Adding API dox generation to the build by CMake macros” was written.

This… works. You just need to prepare this. And ignore that.

Just, looking closer, lots of glitches popped up on the scene. Worse, even show stoppers made their introduction, at both ends of the process pipe:
On the generation side, doxygen turned out to have bitrotted for QCH creation, possibly due to lack of use. Time to sacrifice to the Powers of FLOSS, git clone the sources and poke around to see what is broken and how to fix it. Some time and an accepted pull request later, the biggest issue (some content missed being added to the QCH file) was initially handled, but it still needed to get out as a released version (which it now has been for some months).
On the consumption side, Qt Assistant and Qt Creator turned out to be no longer able to properly show QCH files with JavaScript and other HTML5 content, due to QWebKit having been deprecated/dropped and both apps in many distributions now only using QTextBrowser for rendering the documentation pages. And not everyone is using KDevelop and its documentation browser, which uses QWebKit or, in the master branch, favours QWebEngine if present.
This means an investment into QCH files from doxygen would only be interesting to a small audience. Being currently without the resources and interest to mess around with the Qt help engine sources myself, I look with hope to the resurrection of QWebKit as well as the patch for a QtWebEngine based help engine (if you are Qt-involved, please help and push that patch some more!)

Finally kicking off the production cycle

Not properly working tools, nothing trying to use the tools on a bigger scale… a classical self-blocking state. So it was time to break this up and get some momentum going, by tying the first things together where possible and enabling the generation of QCH files during builds of the KDE Frameworks libraries.

And thus the current master branches (which will become v5.36 in July) have gained, for one, the new module ECMAddQch in Extra-CMake-Modules, and then, in all of the KDE Frameworks libraries with public C++ API, the option to generate QCH files with the API documentation by passing -DBUILD_QCH=ON to cmake. If you have also passed -DKDE_INSTALL_USE_QT_SYS_PATHS=ON (or are installing to the same prefix as Qt), the generated QCH files will be installed to places where Qt Assistant and Qt Creator even automatically pick them up and include them as expected:

KDevelop picks them up as well, but needs some manual reconfiguration to do so.

(And of course ECMAddQch is designed to be useful for non-KF5 libraries as well, give it a try once you got hold of it!)

You and getting rid of the remaining obstacles

So while for some setups the generated QCH files of the KDE Frameworks are already useful (I have been using them for some weeks for e.g. KDevelop development, in KDevelop), for many they still have to become that. This will take some more time and, ideally, contributions from others as well, including Doxygen and Qt Help engine maintainers.

Here a list of related reported Doxygen bugs:

  • 773693 – Generated QCH files are missing dynsections.js & jquery.js, result in broken display (fixed for v1.8.13 by patch)
  • 773715 – Enabling QCH files without any JavaScript, for viewers without such support
  • 783759 – PERL_PATH config option: when is this needed? Still used?
  • 783762 – QCH files: “Namespaces” or “Files” in the navigation tree get “The page could not be found” (proposed patch)
  • 783768 – QCH files: classes & their constructors get conflicting keyword handling (proposed patch)
  • YETTOFILE – doxygen tag files contain origin paths for “file”, leaking info and perhaps an issue with reproducible builds

And a related reported Qt issue:

There is also one related reported CMake issue:

  • 16990 – Wanted: Import support for custom targets (extra bonus: also export support)

And again, it would also be good to see the patch for a QtWebEngine based help engine getting more feedback and pushing from qualified people, and to have distributions making efforts to provide Qt Assistant and Qt Creator with *Web*-based documentation engines (see e.g. the bug filed with openSUSE).

May the future be bright^Wdocumented

I am happy to see that Gentoo & FreeBSD packagers have already started to look into extending their KDE Frameworks packaging with generated API dox QCH files for the upcoming 5.36.0 release in July, with other packagers planning to do so soon as well.

So perhaps one not too distant day it will be just normal business to have QCH files with API documentation provided by your distribution, not just for the Qt libraries themselves, but also for every library based on them. After all, documentation has been one of the things making Qt so attractive. As a developer of Qt-based software, I very much look forward to that day.

Next stop then: QML API documentation


Categories: FLOSS Project Planets

Damián Avila: RISE meets JupyterLab

Planet Python - Mon, 2017-06-19 10:13

JupyterLab is the future for the notebook/authoring experience.

And people started to ask me if we will have RISE on JupyterLab

Do you want to know the answer?

Read more… (1 min remaining to read)

Categories: FLOSS Project Planets

Hideki Yamane: PoC: use Sphinx for debian-policy

Planet Debian - Mon, 2017-06-19 09:09
Before the party, we held our monthly study meeting and I gave a talk about a tiny hack for the debian-policy document.
debian-policy was converted from debian-sgml to docbook in 4.0.0, and my proposal is "Go move forward to Sphinx".

Here's a sample, and you can also get the PoC source from my GitHub repo and check it out.
Categories: FLOSS Project Planets