KDEPIM without Akonadi

Planet KDE - Fri, 2015-07-03 10:16

As you know, Gentoo is all about flexibility. You can run bleeding-edge code (portage, our package manager, even provides you with installation of KF5 and friends from git master) or you can focus on stability and trusted code. This is why, for the last few years, we've been offering our users KDEPIM 4.4 (the version where KMail's e-mail storage was not yet integrated with Akonadi, also known as KMail1) as a drop-in replacement for the newer versions.
Recently the Nepomuk search framework has been replaced by Baloo, and after some discussion we decided that it's now time for the Nepomuk-related packages to go. The problem is that the old KDEPIM packages still depend on them via their Akonadi version. This is why, for those of our users who prefer to run KDEPIM 4.4 / KMail1, we've decided to switch to Pali Rohár's kdepim-noakonadi fork (see also his 2013 blog post and the code). The packages are right now in the KDE overlay, but will move to the main tree after a few days of testing and be treated as an update of KDEPIM.
The fork is essentially KDEPIM 4.4 with some additional bugfixes from the KDE/4.4 git branch, with KAddressbook patched back to its KDEPIM 4.3 state, and with references to Akonadi removed elsewhere. This is in some ways a functionality regression, since the integration of e.g. different calendar types is lost; however, in that version it never really worked perfectly anyway.

For now, you will still need the akonadi-server package, since kdepimlibs (outside kdepim and now at version 4.14.9) requires it to build, but you'll never need to start the Akonadi server. As a consequence, Nepomuk support can be disabled everywhere, and the Nepomuk core, Nepomuk client, and Akonadi client packages can be removed by the package manager (--depclean; make sure to first globally disable the nepomuk USE flag and rebuild accordingly).
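On a Gentoo system, the cleanup just described might look roughly like the following. This is a hedged sketch only, not an official migration script; the exact set of packages removed depends on your profile and world set.

```shell
# Sketch only -- review each step before running on a real system.
# 1. Globally disable the nepomuk USE flag:
echo 'USE="${USE} -nepomuk"' >> /etc/portage/make.conf
# 2. Rebuild packages affected by the changed USE flag:
emerge --ask --newuse --deep @world
# 3. Remove the now-unneeded Nepomuk/Akonadi client packages:
emerge --ask --depclean
```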

You might ask, "Why are you still doing this?"... well. I've been told Akonadi and Baloo are working very nicely, and again I've considered upgrading all my installations... but then, on my work desktop where I am using the newest and greatest KDEPIM, bug 338658 pops up regularly and stops the syncing of important folders. I just don't have the time to pointlessly dig deep into the Akonadi database every few days. So KMail1 it is, and I'd rather spend some time occasionally picking and backporting bugfixes.

Categories: FLOSS Project Planets

Looks as if Wily got Plasma 5.3.2.

Planet KDE - Fri, 2015-07-03 10:06


No backports PPA required.

Plasma 5.3.2.

Daily Wily Images.

Categories: FLOSS Project Planets

GSoC 2015 midterm update

Planet KDE - Fri, 2015-07-03 09:28

I'm working on the project Improve Marble's OSM vector rendering and printing support. The first part of the project is to polish and improve the rendering of the OSM vector tiles in Marble. In my previous post you could see what it looked like without the improvements I have in mind, and now I want to share some pictures and videos of the progress and results so far.

Adding outlines for OSM ways

One of the most outstanding issues was the lack of outlines for streets and highways. My first task was to overcome this problem somehow. These lines are drawn as QPainter lines and thus have no style setting for outlines, so the best I could do was to draw each line twice: first a slightly thicker line (by 2 pixels) in the outline color, then a second one over the top of it in the fill color. This approach alone is of course not enough, because the streets are not rendered in the order in which we want them. This results in something like this (which I showed you in my previous blog post):

The solution is to sort the lines (and every other object) by their z-value, so they are rendered in the correct order. As this sorting is a little resource-hungry, the outlines (and, further on, the decorations) are not rendered below the Normal quality setting. I also changed the style of almost every OSM item and added some new, previously unimplemented items to the list (like cycleways and pedestrian ways), so they match the standard color theme of the openstreetmap website even better. The end result looks good, I think.
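The two techniques above (two-pass drawing and z-ordering) can be sketched like this. This is an illustrative mock-up, not Marble's actual code: all class and function names here are made up, and a small recording stand-in replaces QPainter so the idea is visible without Qt.

```python
class RecordingPainter:
    """Stand-in for QPainter that records draw calls for demonstration."""
    def __init__(self):
        self.calls = []
        self.pen = None

    def set_pen(self, color, width):
        self.pen = (color, width)

    def draw_polyline(self, points):
        self.calls.append((self.pen, tuple(points)))


def draw_way(painter, points, fill_color, outline_color, width):
    # First pass: the outline, 2 pixels wider than the street itself
    painter.set_pen(outline_color, width + 2)
    painter.draw_polyline(points)
    # Second pass: the street fill, drawn over the outline
    painter.set_pen(fill_color, width)
    painter.draw_polyline(points)


def paint_all(painter, ways):
    # Paint in ascending z-order so higher layers (e.g. bridges) land on top
    for way in sorted(ways, key=lambda w: w['z']):
        draw_way(painter, way['points'], way['fill'],
                 way['outline'], way['width'])
```

With this ordering, a way with a lower z-value has both of its passes finished before any higher-z way is touched, so outlines never cut across streets that should sit above them.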

After the improvements
Before the improvements
Decoration and "Fake 3D"

Implementing this feature led us to the idea of creating a more robust API for such decorations for the GeoGraphicsItem classes used in the rendering process (e.g. the class used to describe streets is the GeoLineStringGraphicsItem class). By overriding a simple virtual function, one can create custom decorations for every GeoGraphicsItem-derived class. To demonstrate this functionality, the outlines are implemented this way, and I added a simple 3D effect (a shadow) for the buildings. This takes almost the same approach as the outline rendering, but the shadow is rendered with a small offset based on the focus point of the map (the little cross at the center). I think this is a pretty good example to demonstrate the capability of such a decoration system.

Struggle with the street labels

As you can see in the first pictures, the street labels aren't aligned to the streets at all, even though this is a crucial feature of urban maps. I thought this would be an easier task than the decorations, but I was wrong. Properly aligning the street labels with the right color, right size, right font, right direction, etc. is more of an art than a simple task. One of the problems is that the program needs this to be done in real time, so I can't use a "complex algorithm" to calculate these properties. On top of that, Qt doesn't provide a proper way to draw text along a (curved) path. The best thing I could do is to try to avoid the curvatures of the streets when drawing the labels, so they are rendered properly most of the time. A quick demonstration of what I have achieved so far:
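As an aside, one small ingredient of such label placement can be sketched in a few lines. This is illustrative only (not Marble's code): the rotation angle a label needs in order to follow a street segment between two points.

```python
import math

def label_angle(p1, p2):
    """Angle in degrees of the segment from p1 to p2, measured
    counter-clockwise from the positive x-axis. A label drawn with this
    rotation runs parallel to the segment."""
    return math.degrees(math.atan2(p2[1] - p1[1], p2[0] - p1[0]))
```

Picking the straightest run of segments and using its angle is one way to "avoid the curvatures" when a text-on-path primitive is unavailable.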

Overall

Due to my exams, I couldn't give 100% priority to GSoC, but the progress so far meets my expectations, and from now on I can even double the time spent working, programming, and experimenting.
I'll post a smaller status update when I start working on the printing support improvements; until then I'm going to polish the work I have done so far.
In the end, here is a video demonstrating these improvements:

Categories: FLOSS Project Planets

Roundcube Next: The Next Steps

Planet KDE - Fri, 2015-07-03 09:17

The crowdfunding campaign to provide funding and greater community engagement around the refactoring of Roundcube's core, giving it a secure future, has just wrapped up. We managed to raise $103,531 from 870 people. This obviously surpassed our goal of $80,000, so we're pretty ecstatic. This is not the end, however: now we begin the journey to delivering a first release of Roundcube Next. This blog entry outlines some of that path forward.


The most obvious thing on our list is to get people's t-shirts and stickers out to them. We have a few hundred of them to print and ship, and it looks like we may be missing a few shipping addresses, so I'll be following up with those people next week. Below is a sneak peek of what the shirts might look like. We're still working out the details, so they may look a bit different than this once they come off the presses, but this should give you an idea. We'll be in touch with people about shirt sizes, color options, etc. in the coming week.

Those who elected for the Kolaborator perk will be notified by email about how to redeem their free months on Kolab Now. Of course, everyone who elected for the in-application credits mention will get that in due time as well. We've got you all covered! :)

Note that it takes a couple of weeks for Indiegogo to get the funds to us, and we need to wait for that before confirming our orders and shipping the physical perk items.

Roundcube Backstage

We'll be opening the Roundcube Backstage area in the roughly two weeks after wrap-up completes next week. This will give us enough time to create the Backstage user accounts and get the first set of content in place. We will be using the Discourse platform for discussions and for posting our weekly Backstage updates. I'm really looking forward to reading your feedback there, answering questions, and contemplating the amazing future that lies ahead of us.

The usual channels of Roundcube blogging, forums and mailing lists will of course remain in use, but the Backstage will see all sorts of extras and closer direct interaction with the developers. If you picked up the Backstage perk, you will get an email next week with information on when and where you can activate your account.

Advisory Committee

The advisory committee members will also be getting an email next week with a welcome note. You'll be asked to confirm who the contact person should be, and they'll get a welcome package with further information. We'll also want some information for use in the credits badge: a logo we can use, a short description you'd like to see with that logo describing your group/company, and the web address we should point people to.

The Actual Project!

The funds we raised will cover getting the new core in place with basic email, contacts, and settings apps. We will be able to adopt JMAP into this and build the foundations we so desperately need. The responsive UI that works on phones, tablets, and desktop/laptop systems will come as a result of this work as well, something we are all really looking forward to.

Today we had an all-hands meeting to take our current requirements, mock-ups, and design docs and reflect on how the feedback we received during the campaign should influence them. We are now putting all this together in a clear and concise form that we can share with everyone, particularly our Advisory Committee members as well as in the Backstage area. This will form the basis for our first round of stakeholder feedback, which I am really looking forward to.

We are committed to building the most productive and collaborative community around any webmail system out there, and these are just our first steps. That we have the opportunity here to work with the likes of Fastmail and Mailpile, two entities that one may have thought of as competitors rather than possible collaborators, really shows our direction in terms of inclusivity and looking for opportunities to collaborate.

Though we are at the end of this crowdfunding phase, this is really just the beginning, and the entire team here isn't waiting a moment to get rolling! Mostly because we're too excited to do anything else ;)

Categories: FLOSS Project Planets

Midwestern Mac, LLC: Major improvements to Drupal VM - PHP 7, MariaDB, Multi-OS

Planet Drupal - Fri, 2015-07-03 09:17

For the past couple of years, I've been building Drupal VM to be an extremely tunable, highly performant, super-simple development environment. Since MidCamp earlier this year, the project has really taken off, with almost 200 stars on GitHub and a ton of great contributions and ideas for improvement (some implemented, others rejected).

Categories: FLOSS Project Planets

Ruslan Spivak: Let’s Build A Simple Interpreter. Part 2.

Planet Python - Fri, 2015-07-03 06:43

In their amazing book “The 5 Elements of Effective Thinking” the authors Burger and Starbird share a story about how they observed Tony Plog, an internationally acclaimed trumpet virtuoso, conduct a master class for accomplished trumpet players. The students first played complex music phrases, which they played perfectly well. But then they were asked to play very basic, simple notes. When they played the notes, the notes sounded childish compared to the previously played complex phrases. After they finished playing, the master teacher also played the same notes, but when he played them, they did not sound childish. The difference was stunning. Tony explained that mastering the performance of simple notes allows one to play complex pieces with greater control. The lesson was clear - to build true virtuosity one must focus on mastering simple, basic ideas.1

The lesson in the story clearly applies not only to music but also to software development. The story is a good reminder to all of us to not lose sight of the importance of deep work on simple, basic ideas even if it sometimes feels like a step back. While it is important to be proficient with a tool or framework you use, it is also extremely important to know the principles behind them. As Ralph Waldo Emerson said:

“If you learn only methods, you’ll be tied to your methods. But if you learn principles, you can devise your own methods.”

On that note, let’s dive into interpreters and compilers again.

Today I will show you a new version of the calculator from Part 1 that will be able to:

  1. Handle whitespace characters anywhere in the input string
  2. Consume multi-digit integers from the input
  3. Subtract two integers (currently it can only add integers)

Here is the source code for your new version of the calculator that can do all of the above:

# Token types
#
# EOF (end-of-file) token is used to indicate that
# there is no more input left for lexical analysis
INTEGER, PLUS, MINUS, EOF = 'INTEGER', 'PLUS', 'MINUS', 'EOF'


class Token(object):
    def __init__(self, type, value):
        # token type: INTEGER, PLUS, MINUS, or EOF
        self.type = type
        # token value: non-negative integer value, '+', '-', or None
        self.value = value

    def __str__(self):
        """String representation of the class instance.

        Examples:
            Token(INTEGER, 3)
            Token(PLUS, '+')
        """
        return 'Token({type}, {value})'.format(
            type=self.type,
            value=repr(self.value)
        )

    def __repr__(self):
        return self.__str__()


class Interpreter(object):
    def __init__(self, text):
        # client string input, e.g. "3 + 5", "12 - 5", etc
        self.text = text
        # self.pos is an index into self.text
        self.pos = 0
        # current token instance
        self.current_token = None
        self.current_char = self.text[self.pos]

    def error(self):
        raise Exception('Error parsing input')

    def advance(self):
        """Advance the 'pos' pointer and set the 'current_char' variable."""
        self.pos += 1
        if self.pos > len(self.text) - 1:
            self.current_char = None  # Indicates end of input
        else:
            self.current_char = self.text[self.pos]

    def skip_whitespace(self):
        while self.current_char is not None and self.current_char.isspace():
            self.advance()

    def integer(self):
        """Return a (multidigit) integer consumed from the input."""
        result = ''
        while self.current_char is not None and self.current_char.isdigit():
            result += self.current_char
            self.advance()
        return int(result)

    def get_next_token(self):
        """Lexical analyzer (also known as scanner or tokenizer)

        This method is responsible for breaking a sentence
        apart into tokens.
        """
        while self.current_char is not None:

            if self.current_char.isspace():
                self.skip_whitespace()
                continue

            if self.current_char.isdigit():
                return Token(INTEGER, self.integer())

            if self.current_char == '+':
                self.advance()
                return Token(PLUS, '+')

            if self.current_char == '-':
                self.advance()
                return Token(MINUS, '-')

            self.error()

        return Token(EOF, None)

    def eat(self, token_type):
        # compare the current token type with the passed token
        # type and if they match then "eat" the current token
        # and assign the next token to the self.current_token,
        # otherwise raise an exception.
        if self.current_token.type == token_type:
            self.current_token = self.get_next_token()
        else:
            self.error()

    def expr(self):
        """Parser / Interpreter

        expr -> INTEGER PLUS INTEGER
        expr -> INTEGER MINUS INTEGER
        """
        # set current token to the first token taken from the input
        self.current_token = self.get_next_token()

        # we expect the current token to be an integer
        left = self.current_token
        self.eat(INTEGER)

        # we expect the current token to be either a '+' or '-'
        op = self.current_token
        if op.type == PLUS:
            self.eat(PLUS)
        else:
            self.eat(MINUS)

        # we expect the current token to be an integer
        right = self.current_token
        self.eat(INTEGER)
        # after the above call the self.current_token is set to
        # EOF token

        # at this point either the INTEGER PLUS INTEGER or
        # the INTEGER MINUS INTEGER sequence of tokens
        # has been successfully found and the method can just
        # return the result of adding or subtracting two integers,
        # thus effectively interpreting client input
        if op.type == PLUS:
            result = left.value + right.value
        else:
            result = left.value - right.value
        return result


def main():
    while True:
        try:
            # To run under Python3 replace 'raw_input' call
            # with 'input'
            text = raw_input('calc> ')
        except EOFError:
            break
        if not text:
            continue
        interpreter = Interpreter(text)
        result = interpreter.expr()
        print(result)


if __name__ == '__main__':
    main()

Save the above code into the calc2.py file or download it directly from GitHub. Try it out. See for yourself that it works as expected: it can handle whitespace characters anywhere in the input; it can accept multi-digit integers, and it can also subtract two integers as well as add two integers.

Here is a sample session that I ran on my laptop:

$ python calc2.py
calc> 27 + 3
30
calc> 27 - 7
20
calc>

The major code changes compared with the version from Part 1 are:

  1. The get_next_token method was refactored a bit. The logic to increment the pos pointer was factored into a separate method advance.
  2. Two more methods were added: skip_whitespace to ignore whitespace characters and integer to handle multi-digit integers in the input.
  3. The expr method was modified to recognize the INTEGER -> MINUS -> INTEGER phrase in addition to the INTEGER -> PLUS -> INTEGER phrase. The method now also interprets both addition and subtraction after having successfully recognized the corresponding phrase.

In Part 1 you learned two important concepts, namely that of a token and a lexical analyzer. Today I would like to talk a little bit about lexemes, parsing, and parsers.

You already know about tokens. But in order for me to round out the discussion of tokens I need to mention lexemes. What is a lexeme? A lexeme is a sequence of characters that form a token. In the following picture you can see some examples of tokens and sample lexemes and hopefully it will make the relationship between them clear:
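In case the picture doesn't come through here, a few illustrative pairs in text form (the tokens come from the calculator above; the lexemes are just the raw character sequences that form them):

```python
# lexeme (raw characters)  ->  token type it forms
lexeme_examples = [
    ('345', 'INTEGER'),  # the characters "345" form one INTEGER token
    ('7',   'INTEGER'),
    ('+',   'PLUS'),
    ('-',   'MINUS'),
]
```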

Now, remember our friend, the expr method? I said before that that's where the interpretation of an arithmetic expression actually happens. But before you can interpret an expression you first need to recognize what kind of phrase it is, whether it is addition or subtraction, for example. That's what the expr method essentially does: it finds the structure in the stream of tokens it gets from the get_next_token method and then interprets the phrase it has recognized, generating the result of the arithmetic expression.

The process of finding the structure in the stream of tokens, or put differently, the process of recognizing a phrase in the stream of tokens is called parsing. The part of an interpreter or compiler that performs that job is called a parser.

So now you know that the expr method is the part of your interpreter where both parsing and interpreting happens - the expr method first tries to recognize (parse) the INTEGER -> PLUS -> INTEGER or the INTEGER -> MINUS -> INTEGER phrase in the stream of tokens and after it has successfully recognized (parsed) one of those phrases, the method interprets it and returns the result of either addition or subtraction of two integers to the caller.

And now it’s time for exercises again.

  1. Extend the calculator to handle multiplication of two integers
  2. Extend the calculator to handle division of two integers
  3. Modify the code to interpret expressions containing an arbitrary number of additions and subtractions, for example “9 - 5 + 3 + 11”
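For exercise 3, here is one possible shape, sketched over a ready-made token list rather than the full Interpreter class (the interpret function and its loop are my own illustration, not the tutorial's official solution): after the first INTEGER, keep consuming (PLUS|MINUS) INTEGER pairs in a loop instead of expecting exactly one operator.

```python
from collections import namedtuple

Token = namedtuple('Token', 'type value')

def interpret(tokens):
    """Interpret a token stream of the form INTEGER ((PLUS|MINUS) INTEGER)* EOF."""
    pos = 0
    result = tokens[pos].value          # the leading INTEGER
    pos += 1
    while tokens[pos].type in ('PLUS', 'MINUS'):
        op = tokens[pos]
        right = tokens[pos + 1]         # the INTEGER following the operator
        if op.type == 'PLUS':
            result += right.value
        else:
            result -= right.value
        pos += 2
    return result

# token stream for "9 - 5 + 3 + 11"
tokens = [Token('INTEGER', 9), Token('MINUS', '-'), Token('INTEGER', 5),
          Token('PLUS', '+'), Token('INTEGER', 3),
          Token('PLUS', '+'), Token('INTEGER', 11), Token('EOF', None)]
```

The same loop transplants naturally into the expr method, with self.eat(INTEGER) and self.eat(PLUS)/self.eat(MINUS) replacing the manual index arithmetic.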

Check your understanding.

  1. What is a lexeme?
  2. What is the name of the process that finds the structure in the stream of tokens, or put differently, what is the name of the process that recognizes a certain phrase in that stream of tokens?
  3. What is the name of the part of the interpreter (compiler) that does parsing?

I hope you liked today’s material. In the next article of the series you will extend your calculator to handle more complex arithmetic expressions. Stay tuned.

And here is a list of books I recommend that will help you in your study of interpreters and compilers:

  1. Language Implementation Patterns: Create Your Own Domain-Specific and General Programming Languages (Pragmatic Programmers)

  2. Writing Compilers and Interpreters: A Software Engineering Approach

  3. Modern Compiler Implementation in Java

  4. Modern Compiler Design

  5. Compilers: Principles, Techniques, and Tools (2nd Edition)

BTW, I’m writing a book “Let’s Build A Web Server: First Steps” that explains how to write a basic web server from scratch. You can get a feel for the book here, here, and here. Subscribe to the mailing list to get the latest updates about the book and the release date.


  1. The 5 Elements of Effective Thinking 

Categories: FLOSS Project Planets

Drupal core announcements: Recording from July 3rd 2015 Drupal 8 critical issues discussion

Planet Drupal - Fri, 2015-07-03 06:19

This was our 6th critical issues discussion meeting to be publicly recorded in a row. (See all prior recordings). Here is the recording of the meeting video and chat from today in the hope that it helps more than just those who were on the meeting:

If you also have significant time to work on critical issues in Drupal 8 and we did not include you, let me know as soon as possible.

The meeting log is as follows (all times are CEST real time at the meeting):

[11:06am] alexpott: https://www.drupal.org/node/2280965
[11:06am] Druplicon: https://www.drupal.org/node/2280965 => [meta] Remove or document every SafeMarkup::set() call [#2280965] => 90 comments, 13 IRC mentions
[11:06am] alexpott: https://www.drupal.org/node/2506581
[11:06am] Druplicon: https://www.drupal.org/node/2506581 => Remove SafeMarkup::set() from Renderer::doRender [#2506581] => 49 comments, 9 IRC mentions
[11:07am] alexpott: https://www.drupal.org/node/2506195
[11:07am] Druplicon: https://www.drupal.org/node/2506195 => Remove SafeMarkup::set() from XSS::filter() [#2506195] => 40 comments, 3 IRC mentions
[11:09am] dawehner: https://www.drupal.org/node/2502785
[11:09am] Druplicon: https://www.drupal.org/node/2502785 => Remove support for #ajax['url'] and $form_state->setCached() for GET requests [#2502785] => 51 comments, 18 IRC mentions
[11:09am] Druplicon: dawehner: 5 hours 36 sec ago tell dawehner you might already be following https://www.drupal.org/node/1412090 but I figure you would be in favor.
[11:10am] alexpott: GaborHojtsy: do you what the you link is for the live hangout?
[11:11am] GaborHojtsy: live hangout page: http://youtu.be/rz_EissgU7Q
[11:11am] GaborHojtsy: alexpott: ^^^
[11:11am] GaborHojtsy: to watch that is
[11:11am] alexpott: GaborHojtsy: thanks!
[11:12am] plach: https://www.drupal.org/node/2453153
[11:12am] Druplicon: https://www.drupal.org/node/2453153 => Node revisions cannot be reverted per translation [#2453153] => 159 comments, 58 IRC mentions
[11:12am] plach: https://www.drupal.org/node/2453175
[11:12am] Druplicon: https://www.drupal.org/node/2453175 => Remove EntityFormInterface::validate() and stop using button-level validation by default in entity forms [#2453175] => 79 comments, 9 IRC mentions
[11:14am] plach: https://www.drupal.org/node/2478459
[11:14am] Druplicon: https://www.drupal.org/node/2478459 => FieldItemInterface methods are only invoked for SQL storage and are inconsistent with hooks [#2478459] => 112 comments, 29 IRC mentions
[11:14am] larowlan: https://www.drupal.org/node/2354889
[11:14am] Druplicon: https://www.drupal.org/node/2354889 => Make block context faster by removing onBlock event and replace it with loading from a BlockContextManager [#2354889] => 100 comments, 19 IRC mentions
[11:15am] larowlan: https://www.drupal.org/node/2421503
[11:16am] Druplicon: https://www.drupal.org/node/2421503 => SA-CORE-2014-002 forward port only checks internal cache [#2421503] => 46 comments, 11 IRC mentions
[11:16am] larowlan: https://www.drupal.org/node/2512460
[11:16am] Druplicon: https://www.drupal.org/node/2512460 => "Translate user edited configuration" permission needs to be marked as restricted [#2512460] => 20 comments, 8 IRC mentions
[11:17am] WimLeers: catch: pfrenssen is likely gonna be working on the "numerous paramconverters" issue
[11:17am] WimLeers: the issue catch talked about: https://www.drupal.org/node/2512718
[11:17am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 22 comments, 12 IRC mentions
[11:17am] catch: WimLeers: did you discuss last night's discussion with him already?
[11:18am] WimLeers: https://www.drupal.org/project/issues/search/drupal?project_issue_follow...
[11:19am] WimLeers: https://www.drupal.org/node/2450993
[11:19am] Druplicon: https://www.drupal.org/node/2450993 => Rendered Cache Metadata created during the main controller request gets lost [#2450993] => 132 comments, 27 IRC mentions
[11:20am] pfrenssen: catch: WimLeers: yes I'm working on that, for the moment just studying how it works
[11:20am] berdir: WimLeers: I think we can also demote it if all the other must issues are critical on their own
[11:20am] GaborHojtsy: https://www.drupal.org/node/2512460
[11:20am] Druplicon: https://www.drupal.org/node/2512460 => "Translate user edited configuration" permission needs to be marked as restricted [#2512460] => 20 comments, 9 IRC mentions
[11:20am] catch: pfrenssen: cool. I'll be around most of today so just ping if you want to discuss.
[11:21am] GaborHojtsy: https://www.drupal.org/node/2512466
[11:21am] Druplicon: https://www.drupal.org/node/2512466 => Config translation needs to be validated on input for XSS (like other t string input) [#2512466] => 34 comments, 3 IRC mentions
[11:21am] WimLeers: pfrenssen++
[11:21am] WimLeers: berdir: assuming you're talking about https://www.drupal.org/node/2429287 — then yes, that's exactly the plan :)
[11:21am] Druplicon: https://www.drupal.org/node/2429287 => [meta] Finalize the cache contexts API & DX/usage, enable a leap forward in performance [#2429287] => 112 comments, 10 IRC mentions
[11:21am] GaborHojtsy: https://www.drupal.org/node/2489024
[11:21am] Druplicon: https://www.drupal.org/node/2489024 => Arbitrary code execution via 'trans' extension for dynamic twig templates (when debug output is on) [#2489024] => 43 comments, 12 IRC mentions
[11:22am] GaborHojtsy: https://www.drupal.org/node/2512718
[11:22am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 22 comments, 13 IRC mentions
[11:22am] xjm: GaborHojtsy: I took care of updating the security polict with mlhess
[11:22am] xjm: GaborHojtsy: that's done
[11:23am] • xjm listening to the meeting but cannot join because of 10 person limit
[11:23am] GaborHojtsy: xjm: yay
[11:23am] GaborHojtsy: xjm++
[11:24am] WimLeers: Yes
[11:24am] WimLeers: I DO HAVE A MICROPHONE!!!
[11:24am] WimLeers: :P
[11:24am] xjm: GaborHojtsy: https://www.drupal.org/node/475848/revisions/view/7267195/8630716 now restrict access is mentioned explicitly, so any perm that has it is covered
[11:24am] Druplicon: https://www.drupal.org/node/475848 => Security advisories process and permissions policy => 0 comments, 1 IRC mention
[11:25am] larowlan: https://www.drupal.org/node/2509898
[11:25am] Druplicon: https://www.drupal.org/node/2509898 => Additional uncaught exception thrown while handling exception after service changes [#2509898] => 22 comments, 5 IRC mentions
[11:25am] alexpott: WimLeers: a working microphone :)
[11:28am] WimLeers: alexpott: all of you guys are breaking up for me from time to time. It looks like it's something with Chrome/Hangouts :(
[11:28am] xjm: WimLeers: yeah I had to use FF
[11:28am] WimLeers: My mic *works* if I test it locally.
[11:28am] WimLeers: xjm: lol, the beautiful irony
[11:30am] WimLeers: berdir: alexpott: I checked and I agree with the change you guys just asked me feedback on: https://www.drupal.org/node/2375695#comment-10082188
[11:30am] Druplicon: https://www.drupal.org/node/2375695 => Condition plugins should provide cache contexts AND cacheability metadata needs to be exposed [#2375695] => 117 comments, 40 IRC mentions
[11:32am] alexpott: https://www.drupal.org/node/2497243
[11:32am] Druplicon: https://www.drupal.org/node/2497243 => Rebuilding service container results in endless stampede [#2497243] => 97 comments, 22 IRC mentions
[11:33am] • larowlan using FF too, doesn't trust Google
[11:33am] alexpott: dawehner, catch: anyone got the nid of the chx container issue?
[11:33am] dawehner: https://www.drupal.org/node/2513326
[11:33am] catch: alexpott: https://www.drupal.org/node/2513326
[11:33am] Druplicon: https://www.drupal.org/node/2513326 => Performance: create a PHP storage backend directly backed by cache [#2513326] => 43 comments, 13 IRC mentions
[11:33am] Druplicon: https://www.drupal.org/node/2513326 => Performance: create a PHP storage backend directly backed by cache [#2513326] => 43 comments, 14 IRC mentions
[11:34am] catch: beat me.
[11:36am] larowlan: https://www.drupal.org/node/2511568 for the ctools issue I mentioned
[11:36am] Druplicon: https://www.drupal.org/node/2511568 => Create "context stack" service where available contexts can be registered [#2511568] => 0 comments, 3 IRC mentions
[11:41am] xjm: dropping off now to walk to the venue
[11:41am] alexpott: pfrenssen++
[11:50am] GaborHojtsy: for browser we are working on https://www.drupal.org/node/2430335
[11:50am] Druplicon: https://www.drupal.org/node/2430335 => Browser language detection is not cache aware [#2430335] => 47 comments, 21 IRC mentions
[11:50am] GaborHojtsy: Fabianx-screen: see above
[11:51am] berdir: and it's not a blocker
[11:51am] berdir: because right now it just kills page/smart cache
[11:51am] berdir: so if you use that, you are not seeing a cached page anyway
[11:51am] berdir: Fabianx-screen: we have a cache context for the content language
[11:52am] xjm: the youtube video has like a 20 second delay
[11:53am] GaborHojtsy: xjm: we get what we pay for :D
[11:54am] xjm: now I am talking
[11:55am] plach: https://www.drupal.org/node/2453175#comment-10080242
[11:55am] Druplicon: https://www.drupal.org/node/2453175 => Remove EntityFormInterface::validate() and stop using button-level validation by default in entity forms [#2453175] => 79 comments, 10 IRC mentions
[11:55am] xjm: more like 60s
[11:55am] GaborHojtsy: xjm: well, you’ll be in the recording forver now :D
[11:55am] xjm: :P
[11:56am] xjm: GaborHojtsy: is there any way to increase the number of participants or to include the chat sidebar in the video?
[11:56am] WimLeers1: catch: Can you leave a comment on https://www.drupal.org/node/2512718 with your POV/conclusion — sounds like you have the clearest grasp on this/completest view on the likely solution.
[11:56am] Druplicon: https://www.drupal.org/node/2512718 => Numerous ParamConverters in core break the route / url cache context [#2512718] => 23 comments, 14 IRC mentions
[11:56am] WimLeers1: xjm: chat is here, we don't use Google Hangout's chat sidebar
[11:56am] GaborHojtsy: xjm: the chat window is THIS ONE
[11:57am] larowlan: xjm: the chat one is lost soon as the call ends
[11:57am] xjm: WimLeers: GaborHojtsy: so the thing is that listening to the video it's very hard to figure out which issues people are discussing, even with Gábor's notes
[11:57am] larowlan: xjm: you can get more than 10 in the call if you have a corporate account I think
[11:57am] WimLeers: larowlan: Gábor copy/pastes this chat to the g.d.o post with the link of the recording
[11:57am] • larowlan nods
[11:57am] WimLeers: that was meant for xjm, oops
[11:58am] GaborHojtsy: xjm: the trick is if we would start on time, then my chat log shows timestamps which are sync with the video
[11:58am] GaborHojtsy: xjm: we have a few minute delay between the video start and the top of the hour
[11:58am] catch: WimLeers: yep will do.
[11:59am] GaborHojtsy: xjm: because people arrive later basically :)
[11:59am] WimLeers: catch++
[12:01pm] GaborHojtsy: xjm: its also hard to get Druplicon in google hangouts — as in impossible ÉD
[12:01pm] GaborHojtsy: xjm: and there was some interest for people watching to be able to follow links WHILE the meeting was on
[12:02pm] xjm: GaborHojtsy: but it doesn't work -- I'm trying that right now -- the delay is too big
[12:02pm] WimLeers: alexpott: regarding in title: https://api.drupal.org/comment/26#comment-26
[12:03pm] GaborHojtsy: xjm: well, then at least the Druplicon advantage is there :)
[12:03pm] GaborHojtsy: xjm: I am happy to use some way to get the comments in if there is one
[12:04pm] GaborHojtsy: xjm: I am not sure it helps if eg. someone shares their screen with an IRC client throughout the video
[12:04pm] GaborHojtsy: xjm: the links are not clickable either
[12:04pm] xjm: GaborHojtsy: shannon did that for the meeting we had 2y ago -- though the same problem with links not being clickable
[12:04pm] GaborHojtsy: xjm: but that may be a workaround

Categories: FLOSS Project Planets

DrupalCon News: We're making it easy to give back. Add a membership to your cart.

Planet Drupal - Fri, 2015-07-03 05:00

If you have been to a DrupalCon before, you will notice something new in the registration process. We've made it really easy for you to sign up for Drupal Association membership and bundle it into your overall DrupalCon purchase.

Categories: FLOSS Project Planets

GSoC ’15 Post #3: Install-ed!

Planet KDE - Fri, 2015-07-03 03:07

After familiarising myself with PackageKit-Qt last week, I started working on a small application that uses it this week. The aim was simple – to create an application that uses PackageKit to install packages. Thanks to the detailed guides here (PackageKit reference) and here (PackageKit-Qt API docs) pointed out by ximion, I was able to build a KF5 application that takes user input, asks for a password and installs the application the user typed in. The application can be found here (git).

The Application

The application has a simple interface – a lineEdit and two pushButtons.
Once the user input has been stored in a QString variable (this is the package name), the next step is to resolve the name to a package ID. The package ID is basically the package name plus some more data (related to the system on which the package is being installed). For example, the package ID for the package geany turns out to be

geany;1.24.1+dfsg-1build1;amd64;vivid
geany;1.24.1+dfsg-1build1;i386;vivid

To resolve package names to package IDs, PackageKit provides a function named resolve. Resolve emits package IDs which, when fed to the function packageInstall, install the packages on the system. That's it – all you need to know to build your application are the functions and what they emit.
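As a rough illustration of the package-ID format shown above (assuming the four semicolon-separated fields are name, version, architecture, and distribution data – inferred from the geany example, not from PackageKit documentation), a few lines of Python:

```python
# Hypothetical sketch: split a PackageKit package ID into its fields.
# The four-field layout is inferred from the geany example above.
def parse_package_id(package_id):
    name, version, arch, data = package_id.split(";")
    return {"name": name, "version": version, "arch": arch, "data": data}

pkg = parse_package_id("geany;1.24.1+dfsg-1build1;amd64;vivid")
print(pkg["name"], pkg["arch"])  # -> geany amd64
```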

Next Up

Now, I have all the tools ready to start working on the applications.

Next, I'm working on integrating PackageKit into Dolphin to install Samba. I have run into some build issues, but hopefully they'll be solved soon, and once that's done, I'll just have to replicate there what I did in the application above.

Categories: FLOSS Project Planets

InternetDevels: Drupal 7 application with PhoneGap: full tutorial

Planet Drupal - Fri, 2015-07-03 03:06

Mobile application development is a relatively new area of Drupal development services, but it is advancing rapidly because mobile devices are used more and more.

In terms of adapting websites for mobile platforms, PhoneGap and technologies built on it remain very popular. So let’s see what it is and how it works.

Read more
Categories: FLOSS Project Planets

Petter Reinholdtsen: Time to find a new laptop, as the old one is broken after only two years

Planet Debian - Fri, 2015-07-03 01:10

My primary workhorse laptop is failing and will need a replacement soon. The left 5 cm of the screen on my Thinkpad X230 started flickering yesterday, and I suspect the cause is a broken cable, as changing the angle of the screen sometimes gets rid of the flickering.

My requirements have not really changed since I bought it, and are still as I described them in 2013. The last time I bought a laptop, I had good help from prisjakt.no, where I could select at least a few of the requirements (mouse pin, wifi, weight) and go through the rest manually. A three-button mouse and a good keyboard are not available as options, and all three laptop models proposed today (Thinkpad X240, HP EliteBook 820 G1 and G2) lack three mouse buttons. It is also unclear to me how good the keyboards on the HP EliteBooks are. I hope Lenovo has not messed up the keyboard, even if the quality and robustness of the X series have deteriorated since the X41.

How can I find a sensible laptop when none of the options seem sensible to me? Are there better services around to search the set of available laptops by features? Please send me an email if you have suggestions.

Categories: FLOSS Project Planets

2bits: Backdrop: an alternative Drupal fork

Planet Drupal - Thu, 2015-07-02 23:16
Last week, at the amazing Drupal North regional conference, I gave a talk on Backdrop: an alternative fork of Drupal. The slides from the talk are attached below, in PDF format.
Categories: FLOSS Project Planets

Drupal core announcements: Portsmouth NH theme system critical sprint recap

Planet Drupal - Thu, 2015-07-02 23:16

In early June a Drupal 8 theme system critical issues sprint was held in Portsmouth, New Hampshire as part of the D8 Accelerate program.

The sprint started the afternoon of June 5 and continued until midday June 7.

Sprint goals

We set out to move forward the two (at the time) theme system criticals, #2273925: Ensure #markup is XSS escaped in Renderer::doRender (created May 24, 2014) and #2280965: [meta] Remove or document every SafeMarkup::set() call (created June 6, 2014).


The Drupal Association provided the D8 Accelerate grant which covered travel costs for joelpittet and Cottser.

Bowst provided the sprint space.

As part of its NHDevDays series of contribution sprints, the New Hampshire Drupal Group provided snacks and refreshments during the sprint, lunch and even dinner on Saturday.

Digital Echidna provided time off for Cottser.


xjm committed #2273925: Ensure #markup is XSS escaped in Renderer::doRender Sunday afternoon! xjm’s tweet sums things up nicely.

As for the meta (which comprises about 50 sub-issues), by the end of the sprint we had patches on over 30 of them, 3 had been committed, and 7 were in the RTBC queue.

Thanks to the continued momentum provided by the New Jersey sprint, as of this writing approximately 20 issues from the meta issue have been resolved.

Friday afternoon

peezy kicked things off with a brief welcome and acknowledgements. joelpittet and Cottser gave an informal introduction to the concepts and tasks at hand for the sprinters attending.

After that, leslieg on-boarded our Friday sprinters (mostly new contributors), getting them set up with Drupal 8, IRC, Dreditor, and so on. leslieg and a few others then went to work reviewing documentation around #2494297: [no patch] Consolidate change records relating to safe markup and filtering/escaping to ensure cross references exist.

Meanwhile in "critical central" (what we called the meeting room where the work on the critical issues was happening)…

lokapujya and joelpittet got to work on the remaining tasks of #2273925: Ensure #markup is XSS escaped in Renderer::doRender.

cwells and Cottser started the work on removing calls to SafeMarkup::set() by working on #2501319: Remove SafeMarkup::set in _drupal_log_error, DefaultExceptionSubscriber::onHtml, Error::renderExceptionSafe.

Thai food was ordered in, and many of us continued working on issues late into the evening.


Saturday

joelpittet and Cottser gave another brief introduction to keep new arrivals on the same page and reassert concepts from the day before.

leslieg did some more great on-boarding Saturday and worked with a handful of new contributors on implementing #2494297: [no patch] Consolidate change records relating to safe markup and filtering/escaping to ensure cross references exist. The idea was that by reviewing and working on this documentation the contributors would be better equipped to work directly on the issues in the SafeMarkup::set() meta.

Mid-morning Cottser led a participatory demo with the whole group of a dozen or so sprinters, going through one of the child issues of the meta and ending up with a patch. This allowed us to walk through the whole process and think out loud the whole time.

The Benjamin Melançon XSS attack in action. Having some fun while working on our demo issue.

By this time we had identified some common patterns after working on enough of these issues.

By the end of Saturday all of the sprinters including brand new contributors were collaborating on issues from the critical meta and the issue stickies were flying around the room with fervor (a photo of said issue stickies is below).

Then we had dinner :)

Sunday morning

drupal.org was down for a while.

We largely picked up where we left off Saturday, cranked out more patches, and joelpittet and Cottser started to review the work that had been done the day before that was in the “Needs Human” column.

Our sprint board looked something like this on the last day of the sprint.

Thank you

Thanks to the organizing committee (peezy, leslieg, cwells, and kbaringer), xjm, effulgentsia, New Hampshire DUG, Seacoast DUG, Bowst, Drupal Association, Digital Echidna, and all of our sprinters: cdulude, Cottser, cwells, Daniel_Rose, dtraft, jbradley428, joelpittet, kay_v, kbaringer, kfriend, leslieg, lokapujya, mlncn, peezy, sclapp, tetranz.

Categories: FLOSS Project Planets

New Category, KDE

Planet KDE - Thu, 2015-07-02 20:00

Since I am an active developer of the open-source project digiKam, which is part of the KDE suite, I'm adding a new category on my blog, which will keep you up to date with my latest development on the digiKam project.

Stay tuned and watch for KDE category.

Categories: FLOSS Project Planets

GSoC Midterm, Advanced Metadata Hub

Planet KDE - Thu, 2015-07-02 20:00

This is my 3rd consecutive year participating in the Google Summer of Code program. All three years I have been working on digiKam, an open-source program for image management. digiKam offers extensive tools for managing image collections, such as tagging, quality sorting and managing image metadata. Metadata usually consists of human-readable strings embedded into the image.

The project that I'm currently working on is called Advanced Metadata Hub. Until now, digiKam was able to read and write common metadata such as ratings, comments and tags into the image. To avoid confusion, metadata is structured in key-value pairs, so each program knows where to read and what information to expect.

Until now, the keys were hard-coded in the digiKam sources, and when a user requested support for new metadata, that list had to be manually extended by a developer.

I'm working on building a settings menu, which will allow users to manage where the metadata is written and also change some of the default behaviour:

  • Changing the order of where digiKam should search for data
  • Unify read and write
  • Disable or enable some keys
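The configurable read order can be illustrated with a tiny sketch: try metadata namespaces in the user-chosen order and return the first hit. The namespace keys and the function here are illustrative, not digiKam's real API:

```python
# Illustrative only: return the first value found when trying the
# metadata namespaces in the user-configured search order.
def read_metadata(image_metadata, search_order):
    for namespace in search_order:
        if namespace in image_metadata:
            return image_metadata[namespace]
    return None

metadata = {"Xmp.digiKam.TagsList": ["People/Alice"],
            "Iptc.Application2.Keywords": ["Alice"]}
# Prefer the XMP namespace, fall back to IPTC:
print(read_metadata(metadata, ["Xmp.digiKam.TagsList",
                               "Iptc.Application2.Keywords"]))  # -> ['People/Alice']
```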

By the midterm evaluation I was able to implement a good part of the settings menu:

And some changes to the underlying classes to save and load the settings. The project seems simple, but there are some tricky parts:

  • Different keys need special wrapping of the information
  • When storing tags, you must take care to write both paths and keywords
  • Non-unified reading and writing of metadata can lead to inconsistent behaviour

I'm looking forward to seeing the final result, and I hope it will be an amazing feature.

Categories: FLOSS Project Planets

Sumith: GSoC Progress - Week 6

Planet Python - Thu, 2015-07-02 20:00

Hello! I received a mail a few minutes into typing this: I passed the midterm review successfully :)
It just left me wondering how these guys process so many evaluations so quickly.
I do have to confirm with Ondřej about this.
Anyways, the project goes on, and here is this week's summary.


SymEngine successfully moved to using Catch as a testing framework.

The Travis builds for clang were breaking, which led me to play around with Travis and clang builds to fix the issue. The Linux clang build used to break because we used to mix up and link libraries like GMP compiled with different standard libraries.
Thanks to Isuru for lending a helping hand and fixing it in his PR.

The next task was to make SYMENGINE_ASSERT not use the standard assert(), so I wrote a custom assert which simulates the internal one.
Now we could add -DNDEBUG as a release flag when Piranha is a dependency; this was also done.
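The same idea translates to Python, where the built-in assert is likewise stripped when running under python -O; a hand-rolled assert keeps firing regardless of build flags. This is only an analogue of the C++ macro, not SymEngine's actual code:

```python
# Analogue of a custom assert: raise unconditionally on failure instead
# of relying on the built-in assert statement, which `python -O` removes
# (just as -DNDEBUG removes the C standard assert()).
def symengine_assert(condition, message="SYMENGINE_ASSERT failed"):
    if not condition:
        raise AssertionError(message)

symengine_assert(2 + 2 == 4)           # passes silently
try:
    symengine_assert(1 > 2, "1 is not greater than 2")
except AssertionError as e:
    print(e)                           # -> 1 is not greater than 2
```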

Started work on the Expression wrapper; a PR that starts off from Francesco's work has been sent in.

Investigated the slowdown in benchmarks that I have been reporting in the last couple of posts. Using git bisect (amazing tool, good to see binary search in action!), the first bad commit was tracked down. We realized that the inclusion of the piranha.hpp header caused the slowdown; it was resolved by using mp_integer.hpp, just the required header.
With Francesco's immense help, the problem was narrowed down to this:
* The inclusion of thread_pool leads to the slowdown – specifically, a global variable that it declares.
* In general, a multi-threaded application may result in some compiler optimizations being turned off, hence the slowdown.
* Since this benchmark is memory-allocation intensive, another speculation is that the compiler allocates memory differently.

This SO question asked by @bluescarni should lead to very interesting developments.

We have to investigate this problem and get it sorted – not only because we depend on Piranha, but also because we might have multi-threading in SymEngine later too.


No benchmarking was done this week.
Here are my PR reports.

* #500 - Expression Wrapper

* #493 - The PR with Catch got merged.
* #498 - Made SYMENGINE_ASSERT use a custom assert instead of assert(), and added -DNDEBUG as a release flag with Piranha.
* #502 - Make poly_mul use mpz_addmul (FMA), nice speedup of expand2b.
* #496 - En route to fixing SYMENGINE_ASSERT, led to a minor fix in one of the assert statements.
* #491 - Minor fix in compiler choice documentation.

Targets for Week 6
  • Get the Expression class merged.
  • Investigate and fix the slow-downs.

The rest of the tasks can be finalized in a later discussion with Ondřej.

That's all this week.

Categories: FLOSS Project Planets

Pointing devices KCM: update #2

Planet KDE - Thu, 2015-07-02 19:00

For general information about the project, look at this post

Originally I planned to work on the KCM UI at this time, but as I am unsure how it should look, I started a discussion on the VDG forum and decided to switch to other tasks.

Currently the KCM looks like this:

Don’t worry, it isn’t the final UI, just a minimal demo :)

The KDED module is, I think, almost complete. It can apply settings from the configuration file, and has a method exported to D-Bus to reload the configuration for all devices or for a specific device. Of course, it also applies settings immediately when a device is plugged in. The only thing missing is auto-disabling of some devices (like disabling the touchpad when an external mouse is present).

As usual, here is a link to the repository.

Also, I started working on a D-Bus API for KWin. The API will expose most of libinput's configuration settings. Currently it lists all available devices and some of their most important read-only properties (like name, hardware IDs, capabilities), and allows enabling/disabling tap-to-click as an example of a writable property. As I already said, the KCM isn't ready yet, but I was able to enable tap-to-click on my touchpad using qdbusviewer.

My kwin repo clone is here, branch libinput-dbusconfig

Categories: FLOSS Project Planets

Enrico Zini: italian-fattura-elettronica

Planet Debian - Thu, 2015-07-02 17:48
Billing an Italian public administration

Here's a simple guide for how I managed to bill one of my customers as is now mandated by law in Italy.

Create a new virtualbox machine

I would never do any of this to any system I would ever want to use for anything else, so it's virtual machine time.

  • I started virtualbox, created a new machine for Ubuntu 32bit, 8Gb disk, 4Gb RAM, and placed the .vdi image in an encrypted partition. The web services of Infocert's fattura-pa require "Java (JRE) a 32bit di versione 1.6 o superiore" (a 32-bit Java (JRE), version 1.6 or higher).
  • I installed Ubuntu 12.04 on it: that is what dike declares to support.
  • I booted the VM, installed virtualbox-guest-utils, and made sure I also had virtualbox-guest-x11.
  • I restarted the VM so that I could resize the virtualbox window and have Ubuntu resize itself as well. Now I could actually read popup error messages in full.
  • I changed the desktop background to something that gave me the idea that this is an untrusted machine where I need to be very careful of what I type. I went for bright red.
Install smart card software into it
  • apt-get install pcscd pcsc-tools opensc
  • In virtualbox, I went to Devices/USB devices and enabled the smart card reader in the virtual machine.
  • I ran pcsc_scan to see if it could see my smart card.
  • I ran Firefox, went to preferences, advanced, security devices, load. Module name is "CRS PKCS#11", module path is /usr/lib/opensc-pkcs11.so
  • I went to https://fattura-pa.infocamere.it/fpmi/service and I was able to log in. To log in, I had to type the PIN 4 times into popups that offered little explanations about what was going on, enjoying cold shivers because the smart card would lock itself at the 3rd failed attempt.
  • Congratulations to myself! I thought that all was set, but unfortunately, at this stage, I was not able to do anything else except log into the website.
Descent into darkness

Set up things for fattura-pa
  • I got the PDF with the setup instructions from here. Get it too, for a reference, a laugh, and in case you do not believe the instructions below.
  • I went to https://www.firma.infocert.it/installazione/certificato.php, and saved the two certificates.
  • Firefox, preferences, advanced, show certificates, I imported both CA certificates, trusted for everything, all my base are belong to them.
  • apt-get install icedtea-plugin
  • I went to https://fattura-pa.infocamere.it/fpmi/service and tried to sign. I could not: I got an error about invalid UTF8 for something or other in Firefox's standard error. Firefox froze and had to be killed.
Set up things for signing locally with dike
  • I removed icedtea so that I could use the site without firefox crashing.
  • I installed DiKe For Ubuntu 12.04 32bit
  • I ran dikeutil to see if it could talk to my smart card
  • When signing with the website, I chose the manual signing options and downloaded the zip file with the xml to be signed.
  • I got a zip file, unzipped it.
  • I loaded the xml into dike.
  • I signed it with dike.
  • I got this error message: "nessun certificato di firma presente sul dispositivo di firma" ("no signing certificate present on the signing device") and then this error message: "Impossibile recuperare il certificato dal dispositivo di firma" ("unable to retrieve the certificate from the signing device"). No luck.
Set up things for signing locally with ArubaSign
  • I went to https://www.pec.it/Download.aspx
  • I downloaded ArubaSign for Linux 32 bit.
  • Oh! People say that it only works with Oracle's version of Java.
  • sudo add-apt-repository ppa:webupd8team/java
  • apt-get update
  • apt-get install oracle-java7-installer
  • During the installation process I had to agree to also sell my soul to Oracle.
  • tar axf ArubaSign*.tar*
  • cd ArubaSign-*/apps/dist
  • java -jar ArubaSign.jar
  • I let it download its own updates. Another time I did not. It does not seem to matter: I get asked that question every time I start it anyway.
  • I enjoyed the fancy brushed metal theme, and had an interesting time navigating an interface where every label on every icon or input field was truncated.
  • I downloaded https://www.pec.it/documenti/Manuale_ArubaSign2_firma%20Remota_V03_02_07_2012.pdf to get screenshots of that interface with all the labels intact
  • I signed the xml that I got from the website. I got told that I needed to really view carefully what I was signing, because the signature would be legally binding
  • I enjoyed carefully reading a legally binding, raw XML file.
  • I told it to go ahead, and there was now a .p7m file ready for me. I rejoiced, as now I might, just might actually get paid for my work.
Try fattura-pa again

Maybe fattura-pa would work with Oracle's Java plugin?

  • I went to https://fattura-pa.infocamere.it/fpmi/service
  • I got asked to verify java at www.java.com. I did it.
  • I told FireFox to enable java.
  • Suddenly, and while I was still in java.com's tab, I got prompted about allowing Infocert's applet to run: I allowed it to run.
  • I also got prompted several times, still while the current tab was not even Infocert's tab, about running components that could compromise the security of my system. I allowed and unblocked all of them.
  • I entered my PIN.
  • Congratulations! Now I have two ways of generating legally binding signatures with government issued smart cards!

I shut down that virtual machine and I'm making sure I never run anything important on it. Except, of course, generating legally binding signatures as required by the Italian government.

What could possibly go wrong?

Categories: FLOSS Project Planets

James Mills: A Docker-based mini-PaaS

Planet Python - Thu, 2015-07-02 17:20
The Why

So by now everyone has heard of Docker right? (If not, you have some catching up to do!)

Why have I created this mini-PaaS based around Docker? What's wrong with the myriad of platforms and services out there?

Well, nothing! The various platforms, services and stacks that exist to serve, deploy and monitor applications using Docker all have their use cases and pros and cons.

If you recall the post Flynn vs. Deis: The Tale of Two Docker Micro-PaaS Technologies, I said the following about ~10 months ago.

I've stuck by this and 10 months later here it is.


autodock:
  image: prologic/autodock
  ports:
    - "1338:1338/udp"
    - "1338:1338/tcp"
  volumes:
    - /var/run/docker.sock:/var/run/docker.sock
autodocklogger:
  image: prologic/autodock-logger
  links:
    - autodock
autodockhipache:
  image: prologic/autodock-hipache
  links:
    - autodock
    - hipache:redis
hipache:
  image: hipache
  ports:
    - 80:80
    - 443:443

Gist here: https://gist.github.com/prologic/72ca4076a63d5dd1687d

This uses the following tools and software (all of which I wrote as well):

Now. Here's the thing. Nothing here is particularly fancy.

  • There's no DNS management
  • There are no fancy services to speak of
  • There's no web interface at all
  • There's no API or even a CLI tool

So what is there?

Basically this works in a very simple way.

  1. Setup a wildcard A record on a domain pointing it at your Docker host.
  2. Spin up containers with the -e VIRTUALHOST environment variable.

That's it!

The How

How this works:

  • autodock is a daemon that listens for Docker events via the Docker Remote API
  • autodock is pluggable and provides a UDP-based distributed interface to other plugins.
  • When autodock sees a Docker event it broadcasts it to all nodes.
  • When autodock-hipache sees container start/stop/died/killed/paused/unpaused events it:
    - checks for a VIRTUALHOST environment variable
    - checks for a PORT environment variable (optional, default: 80)
    - checks the configuration of the container for exposed ports
    - if PORT is a valid exposed port, reconfigures hipache with the virtualhost provided by the VIRTUALHOST environment variable, routing web requests to the container's IP address and the port given by PORT
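The checks above amount to a small decision function per container event. Here is a rough Python sketch – the container structure and names are invented for illustration; the real plugin reads this data from the Docker Remote API:

```python
# Rough sketch of the routing decision autodock-hipache makes for a
# container event (illustrative data structure, not the real plugin code).
def route_for(container):
    env = container.get("env", {})
    virtualhost = env.get("VIRTUALHOST")
    if virtualhost is None:
        return None                      # no VIRTUALHOST: nothing to route
    port = int(env.get("PORT", 80))      # PORT is optional, default 80
    if port not in container.get("exposed_ports", []):
        return None                      # PORT must be a valid exposed port
    # hipache then routes web requests for the virtualhost to the container
    return (virtualhost, "http://%s:%d" % (container["ip"], port))

print(route_for({"env": {"VIRTUALHOST": "hello.local"},
                 "exposed_ports": [80],
                 "ip": "172.17.0.2"}))   # -> ('hello.local', 'http://172.17.0.2:80')
```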

Using this is quite simple. Copy the above docker-compose.yml and run:

docker-compose up -d

Then start a container:

docker run -d -e VIRTUALHOST=hello.local prologic/hello

And visit: http://hello.local

Assuming (of course) hello.local points to your Docker host in /etc/hosts.

Of course real deployments of this will use real domains and a real DNS server.

Two example deployments of this can be seen here:

Enjoy! :)

Update: I have now created a new project/tool to help facilitate the setup of this little minimal PaaS. Check out autodock-paas!

Categories: FLOSS Project Planets

Fiber UI Experiments – Conclusion?

Planet KDE - Thu, 2015-07-02 17:13

It's been one heckuva road, but I think the dust is starting to settle on the UI design for Fiber, a new web browser which I'm developing for KDE. After some back-and-forth from previous revisions, there are some exciting new ideas in this iteration! Please note that this post is about design experiments – the development status of the browser is still very low-level and won't reach the UI stage for some time. These experiments are being done now so I can better understand the structure of the browser as I program around a heavily extension-based UI, so that when I do solidify the APIs we have a rock-solid foundation.

Just as an aside before I get started: almost any time I mention "QML", there is a chance that whatever is being driven could alternatively use HTML. I'm looking into this, but make no guarantees.

As a recap to previous experiments, one of the biggest things that became very clear from feedback was that the address bar isn’t going away and I’m not going to hide it. I was a sad panda, but there are important things the address bar provides which I just couldn’t work around. Luckily, I found some ways to improve upon the existing address bar ideology via aggressive use of extensions, and slightly different usage compared to how contemporary browsers embed extensions into the input field – so lets take a look at the current designs;

By default, Fiber will have either "Tabs on Side" or "Tabs on Bottom"; this hasn't been decided yet, but there will also be a "Tabs on Top" option (which I have decided will not be the default, for a few reasons). Gone is the search box as it was in previous attempts – replaced with a proper address bar which I'm calling "Multitool" – and here's more about it and why I'm a little excited;


Fiber is going to be an extensions-based browser. Almost everything will be an extension, from basic navigational elements (back, forward) to the bookmarks system – and all will be either disable-able or replaceable. This means every button, every option, every utility will be configurable. I've studied how other browsers embed extensions in the address bar, and none of them really integrate with it in a meaningful and clearly defined way. Multitool is instead getting a well-defined interface for extensions which make use of the bar;

Extensions which have searchable or traversable content will be candidates for extending into the Multitool, which includes URL entry, search, history, bookmarks, downloads, and other things. Since these are extensions with a well-defined API you will be able to ruthlessly configure what you want or don’t want to show up, and only the URL entry will be set in stone. Multitool extensions will have 3 modes which you can pick from: background, button, and separate.

Background extensions will simply provide additional results when typing into the address bar. By default, this will be the behaviour of things like current tabs, history, and shortcut-enabled search. Button extensions in Multitool will provide a clickable option which will take over the Multitool when clicked, offering a focused text input and an optional QML-based "home popout". Lastly, "separate" extensions will split the text input, offering something similar to a separate text widget – only integrated into the address bar.

The modes and buttons will be easily configurable, and so you can choose to have extensions simply be active in the background, or you could turn on the buttons, or disable them entirely. Think of this as applying KRunner logic to a browser address bar, only with the additional ability to perform “focused searches”.

Shown on the right side of the Multitool are the two extensions with dedicated buttons; bookmarks and search, which will be the default rollout. When you click on those embedded buttons they will take over the address bar and you may begin your search. These tools will also be able to specify an optional QML file for their “home” popout. For example the Bookmarks home popout could be a speed-dial UI, History could be a time-machine-esque scrollthrough. Seen above is a speed dial popout. With Bookmarks and Search being in button mode by default, just about everything else that performs local searches will be in “background mode”, except keyword-based searches which will be enabled – but will require configuration. Generally, the address portion of Multitool will NOT out-of-box beam what you type to a 3rd party, but the search extension will. I have not selected search providers.

We also get a two-for-one deal for fast filtering, since the user is already aware they have clicked on a text entry. Once you pick a selection from a focused search or cancel, the bar will snap back into address mode. If you hit “enter” while doing a focused search, it will simply open a tab with the results of that search.

Aside from buttons, all the protocol and security information relevant to the page (the highlighted areas on the left) will also be extension-driven. Ideally, this will let you highly customise what warnings you get, and will also let extensions tie any content-altering behaviour into proper warnings. For example, the ad-blocker may broadcast the number of zapped ads. When clicked, the extensions will use QML-driven popouts.

Finally, the address itself (and any focused extension searches) will have extension-driven syntax highlighting. Right now I’m thinking of using a monospace font so we can drive things like bold fonts without offsetting text.


Tab placement was a big deal to people; some loved the single-row approach, others wanted a more traditional layout. The solution to the commotion was the fact that there isn’t a single solution. Tabs will have previews and simple information (as seen in the previous round of designs), so by default tabs will be on the bottom or side so the previews don’t obstruct unnecessary amounts of UI.

Fiber will have 3 tabbing options; Tabs on top, tabs on bottom, and tabs on side. When tabs are “on side” it will reduce the UI to one toolbar and place the tabs on the same row as the Multitool, and should also trigger a “compressed” layout for Multitool as well.

There will be the usual “app tab” support of pinning tabs, but not shown here will be tab-extensions. Tab extensions will behave like either app tabs or traditional tabs, and will be QML-powered pages from extensions. These special tabs will also be home-screen or new-tab options, and that is, largely, their purpose; but clever developers may find a use in having extension-based pages.

Tabs can also embed simple toggle-buttons, as usual, powered by extensions. Main candidates for these will be mute buttons or reader-mode buttons. There won’t be much to these buttons, but they will be content-sensitive and extensions will be required to provide the logic for when these buttons should be shown. For example, “reader mode” won’t be shown on pages without articles, and “mute” won’t be shown on pages without sound.

Current Progress

The current focus in Fiber is Profiles, Manifest files, and startup. Profiles will be the same as Firefox profiles, where you can have separate profiles with separate configurations. When in an activities-enabled environment, Fiber Profiles will attempt to keep in sync with the current activity – otherwise they will fall back to having users open a profile tool.

The manifest files are a big deal, since they define how extensions will interact with the browser. Fiber manifest files were originally based on a slimmed-down Chrome manifest with more "Qt-ish" syntax (like CamelCase); but with the more extensive extension plans and placement options there's more going on with interaction points. There's a decent manifest class, and it provides a reliable interface to read from, including things like providing missing defaults and offering some debugging info which will be used in Fiber's extension development tools.

I'm using DBus for Fiber to check a few things on startup; Fiber will be a "kind of" single-instance application, but individual profiles will be separate processes. DBus is being used to speak with running instances to figure out what it should do. The idea behind this setup is to keep instances on separate activities from spiking each other, but to still allow easier communication between windows of a single instance – this should help things like tab dragging between windows immensely. It also gives the benefit that you could run "unstable" extensions in a separate instance, which will be good for development purposes.

I wish I could say development is going quickly, but right now my time is a bit crunched; either way things are going smoothly, and I'd rather be slow and steady than fast and sloppy.

Development builds will be released in the future (still a long ways away) which I'll be calling "Copper" builds. Copper builds will mostly be a rough and dirty way for me to test UI, and will not be stable or robust browsers. Mostly, it'll be for the purpose of identifying annoying UI patterns and nipping them before they get written into extensions.
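The manifest class's missing-defaults behaviour mentioned above might look something like this sketch – the field names and defaults are invented for illustration, not Fiber's real manifest schema:

```python
# Invented example: merge a parsed manifest with defaults and report
# which fields were filled in, suitable for debugging output.
DEFAULTS = {"Placement": "background", "Enabled": True}

def load_manifest(parsed):
    missing = sorted(k for k in DEFAULTS if k not in parsed)
    manifest = dict(DEFAULTS)
    manifest.update(parsed)
    return manifest, missing

manifest, missing = load_manifest({"Name": "Bookmarks", "Placement": "button"})
print(missing)  # -> ['Enabled']
```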

Categories: FLOSS Project Planets
Syndicate content