So now it's in the NEW queue, along with its dependency bmusb. Let's see if I made any fatal mistakes in release preparation :-)
By popular request...
If you go to the Debian video archive, you will notice the appearance of an "lq" directory in the debconf16 subdirectory of the archive. This directory contains low-resolution re-encodings of the same videos that are available in the toplevel.
The quality of these videos is obviously lower than that of the ones made available during debconf, but their file sizes should be down to about a quarter of the full-quality versions. This may make them more attractive as a quick download, as a version for a small screen, as a download over a mobile network, or the like.
Note that the audio quality has not been reduced. If you're only interested in the audio of the talks, these files may be a better option.
The second day of TUG 2016 was again full of interesting talks, spanning from user experiences to highly technical details about astrological chart drawing, from a graphical user interface for TikZ to the invited talk by Robert Bringhurst on the Palatino family of fonts.
With all these interesting things there is only one thing to complain about – I cannot get out of the dark basement and enjoy the city…
After an evening full of sake and a good night’s sleep we were ready to dive into the second day of TUG.

Kaveh Bazargan – A graphical user interface for TikZ
The opening speaker of Day 2 was Kaveh. He first gave us a quick run-down of what he does for business and what challenges publishers are facing these days. After that he introduced us to his new development of a command-line graphical user interface for TikZ. I write “command line” on purpose, because the editing operations are short commands issued on a kind of command line, which give immediate graphical feedback. The basis of the technique is a simplified TikZ-like meta language that is not only easy to write, but also easy to parse.
While the number of TikZ commands and features supported is still quite small, I think the basic idea is a good one, and there is good potential in it.

Matthew Skala – Astrological charts with horoscop and starfont
Next up was Matthew, who introduced us to the involved task of typesetting astrological charts. He included comparisons with various commercial and open source solutions, where Matthew – and I, too – felt that his charts came off quite well!
As an extra bonus we got some charts of famous singers, as well as the TUG 2016 horoscope.

David Tulett – Development of an e-textbook using LaTeX and PStricks
David reported on his project to develop an e-textbook on decision modeling (lots of math!) using LaTeX and PStricks. His e-book is of course a PDF. There was a lot of very welcome feedback – free (CC-BY-NC-ND) textbooks for the sciences are rare and we need more of them.

Christian Gagné – An Emacs-based writing workflow inspired by TeX and WEB, targeting the Web
Christian’s talk revolved around editing and publishing using Emacs org-mode and the various levels of macros one can use in this setup. He finished with a largely incomprehensible vision of a future equational-logic-based notation mode. I have used equational logic in my day-to-day job, and I am not completely convinced that this is a good approach for typesetting and publishing – but who knows, I am looking forward to a more logic-based approach!

Barbara Beeton, Frank Mittelbach – In memoriam: Sebastian Rahtz (1955-2016)
Frank recalled Sebastian’s many contributions to a huge variety of fields, and remembered our much-missed colleague with many photos and anecdotes.

Jim Hefferon – A LaTeX reference manual
Jim reported on the current state of a LaTeX reference manual, which tries to provide documentation orthogonal to the many introductions and user guides available: a straight, down-to-earth reference manual with all the technical bells and whistles necessary.
As I once had to write a reference manual for a computer language myself, it was very interesting to see how they dealt with many of the same problems I faced.

Arthur Reutenauer, Mojca Miklavec – Hyphenation past and future: hyph-utf8 and patgen
Arthur reported on the current status of the hyphenation pattern project, and in particular the license and usage hell they recently got into, with large corporations simply grabbing the patterns without proper attribution.
In the second part he gave a rough sketch of his shot at a reimplementation of patgen. Unfortunately he wrote in rather unreadable handwriting on a flip chart, which meant that only the first row of the audience could actually see what he was writing.

Federico Garcia-De Castro – TeXcel?
As an artist organizing large festivals, Federico has to fight with financial planning and reports. He was not content with the abilities of the usual suspects, so he developed a way to do Excel-like book-keeping in TeX. Nice idea – I hope I can use that system for the next conference I have to organize!

Jennifer Claudio – A brief reflection on TeX and end-user needs
The last speaker in the morning session was Jennifer, who gave us a fresh end-user’s view of the TeX environment and its needs. These kinds of talks are a very welcome contrast to the technical talks, and hopefully all of us developers will take home some of her suggestions.

Sungmin Kim, Jaeyoung Choi, Geunho Jeong – MFCONFIG: Metafont plug-in module for the Freetype rasterizer
Jaeyoung reported on an impressive project to make Metafont fonts available to fontconfig and thus to windowing systems. He also explained their development of a new font format, Stemfont, a Metafont-like system that can also work for CJK fonts, and which they envisage being built into all kinds of mobile devices.

Michael Sharpe – New font offerings — Cochineal, Nimbus15 and LibertinusT1Math
Michael reported on his latest font projects, the first two being extensions of the half-made, half-butchered re-released URW fonts, as well as his first (?) math font project.
I talked to him over lunch one day and asked him how many man-days he needed for these fonts; his answer spoke volumes: for the really messed-up new URW fonts, like Cochineal, he guessed about 5 man-months of work, while other fonts only needed a few days. I think we can all be deeply thankful for all the work he is investing in these font projects.

Robert Bringhurst – The evolution of the Palatino tribe
The second invited talk was by Robert Bringhurst, famous for his wide contributions to typography, book culture in general, as well as poetry. He gave a quick historical overview of the development of the Palatino tribe of fonts, with lots of beautiful photos.
I was really looking forward to Robert’s talk, and my expectations were extremely high. Unfortunately, I must say I was quite disappointed. Maybe it is his style of presentation, but the feeling he conveyed to me (the audience?) was that of going through a necessary medical check, not of much enjoying the presentation. Also, the content itself was not really full of his own ideas or thoughts, but rather a superficial listing of historical facts.
Of course, a person like Robert Bringhurst is so full of anecdotes and background knowledge that it was still a great pleasure to listen, with lots of things to learn; I had only hoped for a bit more enthusiasm.

TUG Annual General Meeting
The afternoon session finished with the TUG Annual General Meeting; reports will be sent out soon to all TUG members.

Herbert Schulz – Optional workshop: TeXShop tips & tricks
After the AGM, Herbert of MacTeX and TeXShop fame gave an on-the-spot workshop on TeXShop. Since I am not a Mac user, I skipped that one.
Another late-afternoon program consisted of an excursion to Eliot’s bookshop, where many of us stocked up on great books. This time again I skipped it and took a nap.
In the evening we had a rather interesting informal dinner in the food court of some building, where only two shops were open; all of us lined up in front of the Japanese curry shop and then gulped down our food from plastic boxes. Hmm, not my style, I have to say, not even for an informal dinner. But at least I could meet up with a colleague from Debian and get some gpg key signing done. And of course, talk to all kinds of people around.
The last stop for me was the pub opposite the hotel, with beer and whiskey/scotch selected by specialists in the field.
Last weekend I had to stay at home, so I did some more virtual training (slowly, in order to not overwork myself again). This time, after all the Zwift, I wanted to test something else: Tacx Trainer Software. Still virtual, but of a different kind.
The difference from Zwift, which renders video-game-like worlds, is that TTS, in the configuration that I used, plays a real-life video which scrolls faster or slower based on your speed. This speed adjustment is so-so, but the appeal was that I could ride roads that I actually know and have driven before. Modern technology++!
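The speed-based scrolling can be pictured as simple playback-rate scaling. A toy sketch of that idea – the function and the model are my own illustration, not anything taken from the actual Tacx software:

```python
def playback_rate(rider_speed_kmh: float, filmed_speed_kmh: float) -> float:
    """Scale video playback so the scenery passes at the rider's speed.

    Hypothetical model: if the film was shot at 25 km/h and the rider
    is doing 12.5 km/h, play the video at half speed.
    """
    if filmed_speed_kmh <= 0:
        raise ValueError("filmed speed must be positive")
    return rider_speed_kmh / filmed_speed_kmh

# Riding at half the filmed speed gives half the playback rate.
print(playback_rate(12.5, 25.0))  # 0.5
```

In practice the adjustment presumably also has to smooth over changes in the speed at which the original footage was filmed, which may be why it feels "so-so".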
And this was the interesting part: I chose for the first ride the road up to Cap de Formentor, which is one of my favourite places in Mallorca. The road itself is also nice, through some very pleasant woods and with some very good viewpoints, ending at the lighthouse, from where you have wonderful views of the sea.
Now, I've driven this road twice, so I kind of remembered it, but driving a road and cycling the same road, especially when it goes up and down and up, are very different things. I remembered well the first uphill (after the flat area around Port de Pollença), but after that my recollection of how much uphill the road goes was slightly off, and I actually didn't remember that there's that much downhill, which was a very pleasant surprise. I did remember the viewpoints (since I took quite a few pictures along the road), but otherwise I was completely off about the height profile of the road. Interesting how the brain works ☺
Overall, this is considered a "short" ride in Tacx's film library; it was 21Km, 835m uphill, and I did it in 1h11m, which for me, after two weeks of no sports, was good enough. Also Tacx has bike selection, and I did this on a simulated mountain bike, with the result that downhill speeds were quite slow (max. 57Km/h, at a -12% grade), so not complaining at all.
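As a quick sanity check, the ride stats quoted above work out to the following averages:

```python
# Ride stats from the text: 21 km, 835 m of climbing, in 1h11m.
distance_km = 21.0
climb_m = 835.0
time_h = 1 + 11 / 60  # 1h11m as a fraction of an hour

avg_speed_kmh = distance_km / time_h
climb_rate_m_per_h = climb_m / time_h

print(round(avg_speed_kmh, 1))    # ≈ 17.7 km/h average
print(round(climb_rate_m_per_h))  # ≈ 706 m of climbing per hour
```

Roughly 700 vertical metres per hour is a respectable climbing pace for a "short" ride, especially after two weeks off the bike.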
Next I'll have to see how the road to Sa Calobra is in the virtual world. And next time I go to Mallorca (when/if), I'll have to actually ride these in the real world.
In the meantime, some pictures from an actual trip there. I definitely recommend visiting this, preferably early in the morning (it's very crowded):
A few more pictures and larger sizes here.
I have a long-overdue blog entry about what happened in recent times. People who follow my tweets did catch some things. Most noteworthy, there was the Trans*Inter*Congress in Munich at the start of May. It was an absolute blast. I met so many nice and great people, and talked about and experienced so many great things there that I still get a motivational push from it every time I think back. It was also the time when I realized that I do in fact have body dysphoria, even though I thought I was fine with my body in general: being tall is a huge issue for me. Realizing that I have a huge issue (yes, pun intended) with my height was quite relieving, even though it doesn't make it go away. It's something that makes passing and transitioning harder for me. I'm well aware that there are tall women, and that there are dedicated shops for tall women, but that's not the only thing I have trouble with. What bothers me most is what people read into tall people: that they are always someone others can lean on for comfort, that tall people are always considered to be self-confident and to stand up for themselves (another pun, I know ... my bad).
And while I'm fine with people coming to me to lean on, I rarely get the chance to do so myself. And people don't even consider it. When I was in Munich, talking with another great (... pun?) trans woman who was as tall as me, I finally had the possibility to just rest my head on her shoulder and feel the comfort I need just as much as everyone else out there. Probably that's also the reason why I'm so touchy-feely and go Free Hugging as often as possible. But being tall also means that you are usually only the big spoon when cuddling up. Having a small mental breakdown upon realizing that didn't change the feeling directly, but it definitely helped with looking for what I could change to fix that for myself.
Then, at the end of May, the movie FtWTF - female to what the fuck came to cinemas. It's a documentary about six people who were assigned female at birth. And it's absolutely charming, with great food for thought in it. If you ever get the chance to watch it, you definitely should.
And then came debconf16 in Cape Town. The flight there was canceled and we had to be rebooked. The first offer was to go through Dubai, and gladly a colleague pointed out to the person behind the desk that that wouldn't be safe for me and was thus out of scope. In the end we managed to get to Cape Town quite nicely, and even though it was winter, it was quite pleasant while the sun was shining. Besides the cold nights, that is. Or being stuck on the way up to Table Mountain because a colleague had cramps in his legs and we had to call mountain rescue. Gladly the night was clear, and when the mountain rescue finally got us to the top, by which time it was already night, we had one of the nicest views from up there that most people will probably never experience.
And then ... I got invited to a trans meetup in Cape Town. I was both excited and nervous about what to expect there. But it was simply great. The group there was outstandingly great. The host gave an update on the progress of clinical support within South Africa; what I took away is that there is only one clinic there for SRS, which manages only two people a year, which is simply ... yuck. I guess you can imagine how many years (yes, decades) long the waiting list is ... I was blown away, though, by the diversity of the group, on so many levels, most notably on the age spectrum. It was a charm to meet you all there! If you ever stop by in Cape Town and you are part of the LGBTIQ community, make sure you get in contact with the Triangle Project.
But, on to the real reason to write this entry: I was approached at Debconf by at least two people who asked me what I thought about creating an LGBTIQA+ group within Debian, and whether I'd like to push for that. Actually I think it would be a good idea to have some sort of exchange between people on the queer spectrum (and I hope I don't offend anyone by just saying queer for LGBTIQA+ people). Given that I'm quite outspoken, people approach me every now and then, so I'm aware that there is a fair number of people who would fall into that category. On the other hand, some of them wouldn't want to have it publicly known, because it shouldn't matter and isn't really the business of others.
So I'm uncertain. If we follow that path, I guess something that is closed, or at least offers the possibility of closed communication, would be needed so as not to out someone just by their joining the discussion. It was easier with Debian Women, where it was (somewhat) clear that male participants are allies supporting the cause and not considered to be women themselves; but often enough (mostly cis hetero male) people are afraid to join a dedicated LGBTIQA+ group for fear of having their identity judged. These things should be considered before creating such a place, so that people can feel comfortable when joining and know what to expect beforehand.
For the time being I created #debian-diversity on irc.debian.org to discuss how to move forward. Please bear in mind that even the channel name is up for discussion. Acronyms might not be the way to go, in my opinion; just read back through the discussion that led to the Diversity Statement of Debian, where the original approach was to start listing groups for inclusiveness, but it quickly became clear that such a list can get outdated too easily.
I am willing to be part of that effort, but right now I have some personal things to deal with which eat up a fair amount of my time. My kid starts school in September (yes, it's been that long already, time flies ...). And it looks like I'll have to move a second time in the near future: I'll have to leave my current flat by the end of the year, and the Que[e]rbau I'm moving into won't be ready to host me by that time ... F*ck. :(
The first day of the real conference started with an excellent overview of what one can do with TeX, spanning from traditional scientific journal styles to generating router configuration for cruising ships.
All this was crowned with an invited talk by Kevin Larson from Microsoft’s typography department on how to support reading comprehension.

Pavneet Aurora – Opening: Passport to the TeX canvas
Pavneet, our never-sleeping host and master of organization, opened the conference with a very philosophical introduction, touching upon a wide range of topics from Microsoft and Twitter to the beauty of books, pages, and type. I think at some point he even mentioned TeX, but I can’t remember for sure. His words set up a very nice and all-inclusive stage: a community that is open to all kinds of influences without disregard or prejudice. Let us hope that it reflects reality. Thanks, Pavneet.

Geoffrey Poore – Advances in PythonTeX
Our first regular talk was by Geoffrey, reporting on recent advances in PythonTeX, a package that allows including Python code in your TeX document. Starting with an introduction to PythonTeX, Geoffrey reported on an improved verbatim environment, fvextra, which patches fancyvrb, and on improved interaction between tikz and PythonTeX.
As I am a heavy user of listings for my teaching on algebraic specification languages, I will surely take a look at this package and see how it compares to listings.

Stefan Kottwitz – TeX in industry I: Programming Cisco network switches using TeX
Next was Stefan from Lufthansa Industry Solutions, who first reported on his working environment – cruise ships with a very demanding IT infrastructure that he has to design and implement. Then he introduced us to his way of generating IP configurations for all the devices using TeX. The reason he chose this method is that it allows him to generate proper documentation at the same time.
It was surprising for me to hear that by using TeX he could produce well-designed and easily accessible documentation far more efficiently and quickly, which helped the company and made the clients happy!

Stefan Kottwitz – TeX in industry II: Designing converged network solutions
After a coffee break, Stefan continued his exploration of industrial usage of TeX, this time about using tikz to generate graphics representing the network topology on the ships.

Boris Veytsman – Making ACM LaTeX styles
Next up was Boris, who brought us back to the traditional realms of TeX when he guided us into the abyss of the ACM LaTeX styles he had tried to maintain for some time, until he plunged into a complete rewrite of the styles.

Frank Mittelbach – Alice goes floating — global optimized pagination including picture placements
The last talk before lunch (probably a strategic placement, otherwise Frank would continue for hours and hours) was Frank on global optimization of page breaks. Frank showed us what can and cannot be done with current LaTeX, and how to play around with global optimization of pagination, using Alice in Wonderland as a running example. We can only hope that his package is soon available in an easily consumable version to play around with.

Thai lunch
Pavneet organized three different lunch styles for the three days of the conference; today’s was Thai, with spring rolls, fried noodles, one kind of very interesting orange noodles, and chicken something.

Michael Doob – baseball rules summary
After lunch Michael gave us an accessible explanation of the most arcane rules a game can have – the baseball rules – by using pseudo code. I think the total number of lines of code needed to explain the overall rules would fill more pages than the New York phone book, so I am deeply impressed by all those who can understand these rules. Some of us even wandered off in the late afternoon to see a match, with live explanations by Michael.

Amartyo Banerjee, S.K. Venkatesan – A Telegram bot for printing LaTeX files
Next up was Amartyo, who showed a Telegram (as in the messenger application) bot running on a Raspberry Pi that receives (La)TeX files and sends back compiled PDF files. While it is not ready for consumption (if you sneeze, the bot will crash!), it looks like a promising application. Furthermore, it is nice to see how open APIs (like Telegram’s) can spur development of useful tools, while closed APIs (including those threatening users, like WhatsApp’s) hinder it.

Norbert Preining – Security improvements in the TeX Live Manager and installer
Next up was my own talk about beefing up the security of TeX Live by providing integrity and authenticity checks via GnuPG, a feature that has been introduced with the recent release of TeX Live 2016.
The following discussion gave me several good ideas on how to further improve security and usability.

Arthur Reutenauer – The TeX Live M sub-project (and open discussion)
Arthur presented the TeX Live M project (where the M supposedly stands for Mojca, who unfortunately couldn’t attend): their aim is to provide a curated and quality-verified sub-part of TeX Live that is sufficiently complete for many applications, and easier to handle for distributors and packagers.
We had a lively discussion after Arthur’s short presentation, mostly about why TeX Live does not have an “on-the-fly” installation like MiKTeX. I insisted that this is already possible, using the “tex-on-the-fly” package, which uses the mktextex infrastructure, but also cautioned against using it by default due to the delays induced by repeatedly reading the TeX Live database. I think this is a worthwhile project for someone interested in learning the internals of TeX Live, but I am not sure whether I want to invest time into this feature.
Another discussion point was a testing infrastructure, which I am currently working on. This is in fact high on my list: to have some automatic minimal functionality testing – a LaTeX package should at least load!

Kevin Larson – Reading between the lines: Improving comprehension for students
Having a guest from Microsoft is rather rare in our quite Unix-centered environment, so big thanks to Pavneet again for setting up this contact, and big thanks to Kevin for coming.
Kevin gave us a profound introduction to reading disabilities and how to improve reading comprehension. Starting with an excursion into what makes a font readable and how Microsoft develops optimally readable fonts, he then turned to reading disabilities like dyslexia, and how markup of text can increase students’ comprehension rates. He also toppled my long-held belief that dyslexia is connected to the similar shapes of letters which are somehow visually misprocessed – this was the scientific consensus from the 1920s until the 70s, but since then researchers have abandoned this interpretation, and dyslexia is now linked to problems connecting shape to phonemes.
Kevin did an excellent job with a slightly difficult audience – some people picking at grammar differences between British and US English and permanently derailing the discussion, and even more so the high percentage of participants with rather particular typographic tastes.
After the talk I had a lengthy discussion with Kevin about whether and how this research can be carried over to non-Roman writing systems, in particular Kanji/Hanzi-based writing systems, where dyslexia probably shows itself in different ways. Kevin also mentioned that they want to add inter-word spaces to Chinese to help learners of Chinese (children, foreigners) parse it better; studies showed that this helps a lot with comprehension.
On a meta level, this talk bracketed nicely with the morning introduction by Pavneet, describing an open environment with stimulus back and forth in all directions. I am very happy that Kevin took the pains to come despite his tight schedule, and I hope that the future will bring better cooperation – in the end we are all working somehow on the same front – only the tools differ.
After the closing of the session, one part of our group went off to the baseball match, while another group dived into a Japanese-style izakaya, where we managed to kill huge amounts of sake and quite an amount of food. The photo shows me after the first bottle of sake, sipping on a small intermediate amount of genshu (a kind of strong, undiluted sake) before continuing to the next bottle.
An interesting and stimulating first day of TUG, and I am sure that everyone was looking forward to day 2.
This particular week has been tiresome, as I caught a cold ;). I came back from Cape Town, where DebConf was taking place. My arrival in Montreal was in the middle of the week, so this week is not full of news…

What I’ve done
I have synced myself with my coworker Nicolas Reynaud, who’s working on building the indexation system over the DHT. We have worked together on critical algorithms: concurrent maintenance of data in the trie (PHT).

Week 9

What I’ve done
Since my mentor, who’s also the main author of OpenDHT, was away presenting Ring at the RMLL, I was assigned tasks that needed quick attention. I’ve been working on making OpenDHT run properly when compiled with a failing version of Apple’s LLVM. I’ve had the pleasure of debugging obscure runtime errors that appear or disappear depending on the compiler you use – and I mean very obscure.
I have released OpenDHT 0.6.2! This release was meant to fix a critical functionality bug that would arise if one of the two routing tables (IPv4, IPv6) was empty. It was really critical for Ring to have the 0.6.2 version, because it is not rare for a user to connect to a router that doesn’t give out IPv6 addresses.
Finally, I have fixed some minor bugs in my work on the queries.
reprotest 0.2 is available on PyPI and should hit Debian soon. I have tested null (no container, build on the host system), schroot, and qemu, but it's likely that chroot, Linux containers (lxc/lxd), and quite possibly ssh are also working. I haven't tested the autopkgtest code on a non-Debian system, but again, it probably works. At this point, reprotest is not quite a replacement for the prebuilder script because I haven't implemented all the variations yet, but it offers better virtualization because it supports qemu, and it can build non-Debian software because it doesn't rely on pbuilder.
With HW42's help, I fixed the schroot/disorderfs/autopkgtest permission error and got pip/setuptools to install the autopkgtest code from virt/. The permission error came from autopkgtest automatically running all commands on a testbed as root, if it can run commands as root, which caused schroot to mount disorderfs as root. This caused the build artifact to be owned by root, but unlike qemu, a chroot shares the same file system, so it would then try to copy this root-owned file with a process running with ordinary user permissions. The fix was to mount disorderfs with --multi-user=yes when the container has root permissions, allowing a user process to access it even when it's mounted by root. The fix for setuptools is an ugly hack: including the virt/ scripts as non-code data with include reprotest/virt/* in MANIFEST.in works. (According to the documentation, graft reprotest/virt/, which I also tried, ought to work, but in my testing it doesn't.) This still uses sys.path hacking as a consequence of autopkgtest's design choices.
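For reference, the MANIFEST.in arrangement described above boils down to something like this (only the `include` line is what actually worked; the commented alternative is the documented form that didn't):

```
# Ship the autopkgtest-derived virt/ servers as non-code package data:
include reprotest/virt/*

# According to the setuptools documentation this graft form should be
# equivalent, but it did not work in testing:
# graft reprotest/virt/
```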
For the next release, I want to finish all the remaining variations, which at this point are build path, domain, host, group, shell, and user. I also want to ensure that reprotest runs on non-Debian and possibly non-Linux systems.
For now, what I need most is more real-world testing. If you have any interest in reproducible builds, please try reprotest out! There's a README included which describes how to set up appropriate containers and run the tests.
The second pre-conference day was dedicated to books and beers, with a visit to an exquisite print studio, and a beer tasting session at one of the craft breweries in Canada. In addition we could grab a view into the Canadian lifestyle by visiting Pavneet’s beautiful house in the countryside, as well as enjoying traditional style pastries from a bakery.
In short, a perfect combination for us typography and beer savvy freaks!
This morning we had a somewhat early start from the hotel. Soon the bus left downtown Toronto and entered countryside Ontario: large landscapes filled with estates and houses that felt huge to my Japan-accustomed eyes, separated by fields, forests and wild landscape. Very beautiful, and inviting to live in. On our way to the printing workshop we stopped at Pavneet’s house for a very short visit of the exterior, which includes mathematics in the brickwork. According to Pavneet, his kids hate to see math on the wall – I would be proud to have it.
A bit further on we entered Erin, where the Porcupine’s Quill is located. A small building along the street – one could easily overlook that rare jewel! Especially considering that, according to the owners, Google Maps has a bad error which would lead you to a completely different location. This printing workshop, led by Tim and Elke Inkster, produces books in a traditional style using an old Heidelberg offset printing machine.
Elke introduced us to the sewing of folded signatures together with a lovely old sewing machine. It was the first time I actually saw one in action.
Tim, the head of the printing shop, first entertained us with stories about Chinese publishers visiting them in the old Cold War times, before diving into explanations of the actual machines around, like the Heidelberg offset printing machine.
In the back of the basement of the little studio there is also a huge folding machine, which cuts the big printed sheets and folds them into 16-page signatures. An impressive example of tricky engineering.
Due to the small size of the printing studio, our group was actually split in two, and while the other group got its guided tour, we grabbed coffee and traditional cookies and pastries from the nearby Holtom’s bakery. Loads of nice pastries with various fillings, my favorite being the slightly salty cherry pie, and above all the rhubarb-raspberry pie.
To my absolute astonishment I also found there a Viennese “Kaisersemmel”, called a “Kaiser bun” here, keeping the shape and the idea (but unfortunately not the crispy, cracky quality of the original in Vienna). Of course I got two of them, together with a nice jam from the region, and enjoyed this “Viennese breakfast” the next morning.
Leaving the Quill we headed for lunch in a nice pizzeria (I got a Pizza Toscana) which also served excellent local beer – how I would like to have something like this over in Japan! Our last stop on this excursion was Stone Hammer Brewery, one of the most famous craft breweries in Canada.
Although they won’t win a prize for typography (besides one side of a coaster there that carried a nice pun), their beers are exquisite. We got five different beers to taste, plus extensive explanations of brewing methods and differences. Now I finally understand why most of the new craft breweries in Japan are making ales: ales don’t need a long process and are ready for sale in a rather short time, compared to e.g. lagers.
Filled with excellent beer, some of us (notably an unnamed US TeXnician and politician) stocked up on beers to carry home. I was very tempted to get a huge batch, but putting cans into an airplane did not seem like the optimal idea. Since we are talking cans: I was surprised to hear that many craft beer brewers nowadays prefer cans due to their better protection of the beer from light and oxygen, both killers of good beer.
Before leaving we took a last look at the Periodic Table of Beer Types, which left me in awe about how much I don’t know and probably never will know. In particular, I heard for the first time of a “Vienna style beer” – Vienna is not really famous for beer; better to say, it is infamous. So maybe a different Vienna than my home town is meant here.
Another two-hour bus ride brought us back to Toronto, where we met with other participants at the reception in a restaurant of Mediterranean cuisine, where I could enjoy, for the first time in years, good tahini and hummus.
All in all another excellent day. Now I would only like to have two days of holiday; I guess I’ll have to relax in the lectures starting tomorrow.
I think my demands for a terminal emulator are pretty basic, but nonetheless I run into trouble every now and then. This time it was a new laptop and starting from scratch with an empty $HOME and current Debian/testing instead of good old Jessie.
For the last four or five years I've been a happy user of gnome-terminal: configured a monospace font, a light grey background with black text, create new tabs with Ctrl-n, navigate the tabs with Ctrl-Left and Ctrl-Right, show no menubar, select URLs with a double click. Suited me well with my similarly configured awesome window manager, where I navigate with Mod4-Left and Mod4-Right between the desktops on the local screen and only activate a handful of the many default tiling modes.
While I could get back most of my settings, somehow all the gconf kung fu cited online to reconfigure the URL selection pattern in gnome-terminal failed, and copy&pasting URLs from the terminal was a pain in the ass. Long story short, I now followed the advice of a coworker and just use xfce4-terminal.
That still required a few tweaks to get it back to doing what I want. To edit the keybindings you have to know that you have to use the GTK way and edit them within the menu while selecting the menu entry. But you have to allow that first (why oh why?):

echo "gtk-can-change-accels=1" >> ~/.gtkrc-2.0
Fair enough, that is documented. Changing a keybinding generates fancy things in ~/.config/xfce4/terminal/accels.scm, in case you plan to hand-edit a few more of them.
I also edited a few things in ~/.config/xfce4/terminal/terminalrc:

MiscAlwaysShowTabs=TRUE
MiscMenubarDefault=FALSE
So I guess I can remove gnome-terminal for now and stay with another GTK2 application. Doesn't feel that good, but at least it works.
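For reference, the tweaks above can be collected into one small idempotent script. This is just a sketch under my assumptions: the GTKRC/TERMRC variables are mine (so the stock paths can be overridden), and the config keys are the ones mentioned above.

```shell
# Sketch: apply the GTK2 and xfce4-terminal tweaks idempotently.
# GTKRC and TERMRC are hypothetical override variables; defaults are the stock paths.
GTKRC="${GTKRC:-$HOME/.gtkrc-2.0}"
TERMRC="${TERMRC:-$HOME/.config/xfce4/terminal/terminalrc}"

# 1. Allow editing accelerators from the menu (GTK2 applications):
grep -qs '^gtk-can-change-accels=1' "$GTKRC" || \
  echo 'gtk-can-change-accels=1' >> "$GTKRC"

# 2. Ensure the terminalrc settings are present:
mkdir -p "$(dirname "$TERMRC")"
touch "$TERMRC"
for opt in 'MiscAlwaysShowTabs=TRUE' 'MiscMenubarDefault=FALSE'; do
  grep -qs "^${opt%%=*}=" "$TERMRC" || echo "$opt" >> "$TERMRC"
done
```

Running it twice is safe: each line is only appended when its key is not already set.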
Git-cinnabar is a git remote helper to interact with mercurial repositories. It allows cloning, pulling and pushing from/to mercurial remote repositories using git.
These release notes are also available on the git-cinnabar wiki.

What’s new since 0.4.0b1?
- Some more bug fixes.
- Updated git to 2.9.2 for cinnabar-helper.
- Now supports `git push --dry-run`.
- Added a new `git cinnabar fetch` command to fetch a specific revision that is not necessarily a head.
- Some improvements to the experimental native wire protocol support.
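For context, typical usage looks like this; a sketch only, where the repository URL and revision are placeholders, and git-cinnabar is assumed to be installed as a git remote helper:

```
git clone hg::https://example.org/repo     # clone a mercurial repository via git
cd repo
git pull                                   # pull new mercurial changesets
git push --dry-run                         # now supported in this release
# new: fetch a specific mercurial revision that is not a head
git cinnabar fetch hg::https://example.org/repo <changeset-sha1>
```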
This year's TUG is held in Toronto, Canada, and our incredible host Pavneet has managed to put together a busy program of excursions and events around the conference proper. The -1st day (yeah, you read that right, the minus-first day), that is two days before the actual conference starts, was dedicated to an excursion to enjoy wines at the wine estate Château des Charmes, followed by a visit to the Niagara Falls.
What should I say, if the first thing after getting out of the bus is a good wine, then there is nothing to go wrong …
I had arrived in Toronto two days earlier in the late afternoon, and spent Friday relaxing, recovering from the long flight, walking the city a bit, trying to fix my cold, and simply taking it slowly. On Saturday we met at a comfortable 10am (still too early for me, due to jet lag and a slightly late evening before), but the first two hours of bus ride allowed us to relax. Our first stop was the Château des Charmes, an impressive building surrounded by vineyards.
We were immediately treated to a sandwich lunch with white, red, and ice wine. Good start! And although breakfast wasn’t that long ago, the delicious sandwiches (at least the vegetarian ones I tried) were a good foundation for the wine.
After having refilled our energy reserves, we were ready to start our tour. Our guide, very, if not overly, enthusiastic, explained to us that practically everything related to wine in Canada was started at this Château by the current owner – the Château system where farmer and wine producer are the same, the import of European grapes, winter protection methods, wine making – I was close to forgetting our Roman and Greek ancestors. At least she admitted that the ice wine was brought over by an Austrian – but it was perfected here, where the controls of the government are much stricter than anywhere else … hmmm, somehow I cannot completely believe all of this narrative, but at least it is enjoyable. Now that we knew all about the history, we dove into the production area and the barrel space, always accompanied by extensive comments and (self-)praise.
After this exhaustive and exhausting round, we were guided back to the patio to taste another three different wines: a white (a bit too warm, not so much my taste), a rosé (very good), and a red made from a new grape variety that first mutated here on the Château (interesting). Since I hadn’t had enough, I tried to get something out of the big containers directly, but without success!
Happy and content, and after passing through the shopping area, we boarded the bus to continue in the direction of the Niagara Falls. Passing by quite some nice houses of definitely quite rich people (although Pavneet told me that houses in Toronto are far more expensive than those here – how can anyone afford this?), we first got a view onto the lower Niagara river. A bit further on we were let out to see huge whirlpools in the river, where boat tours bring sightseers on a rough ride into the pool.
Only a slight bit further on we finally reached the Niagara Falls, with a great view onto the American Falls in full power, and the Horseshoe Falls further up.
We immediately boarded a boat tour making a short trip to the Horseshoe Falls. Lining up with hundreds and hundreds of other spectators, we prepared for the splash in red rain wear (the US side uses blue, not that either side would rescue a wrong person and create an illegal immigrant!). The trip passes first under the American Falls and continues right into the mist that fills the middle of the Horseshoe Falls. A spectacular impression, with walls of water falling down on both sides.
Back from the splash and having dried our feet, we walked along the ridge to see the Horseshoe Falls from close up. The amount of water coming down these falls is incredible, and so is the erosion that creates the brown foam on the water down in the pool, made up of pulverized limestone. Blessed as we were, the sun was out all day and we got a nice rainbow right in the falls.
The surroundings of the falls are less impressive – Disneyland? Horror cabinet? Yodel bar? A wild mixture of amusement-park-style locations squeezed together and overly full of people – as if enjoying nature itself were not enough. All immersed in ever-blasting loudspeaker music.
The only plus point I could find in this encampment of forced happiness was a local craft beer brewer where one could taste 8 different beers – I made it only to four, though.
Finally night fell and we moved down to the falls again to enjoy their illumination.
After this wonderful finish we boarded the bus back to Toronto, where we arrived around midnight. A long but very pleasurable Day Minus One!
Further photos can be seen at the gallery TUG 2016 – Day Minus One.
I became interested in running Debian on NVIDIA's Tegra platform recently. NVIDIA is doing a great job getting support for Tegra upstream (u-boot, kernel, X.org and other projects). As part of ensuring good Debian support for Tegra, I wanted to install Debian on a Jetson TK1, a development board from NVIDIA based on the Tegra K1 chip (Tegra 124), a 32-bit ARM chip.
Ian Campbell enabled u-boot and Linux kernel support and added support in the installer for this device about a year ago. I updated some kernel options since there has been a lot of progress upstream in the meantime, performed a lot of tests and documented the installation process on the Debian wiki. Wookey made substantial improvements to the wiki as well.
If you're interested in a good 32-bit ARM development platform, give Debian on the Jetson TK1 a try.
There's also a 64-bit board. More on that later...
Everyone knows that today the finish line of Le Tour de France was crossed on the Champs-Élysées in Paris... And if you haven't seen any of the videos, I highly recommend checking out the onboard camera views and the landscapes! Quel beau pays (what a beautiful country)!
I'm happy to let you know that today I won the Tour de Crosstown 2016, which is the cycling competition at Lifetime Crosstown inspired by and concurrent with Le Tour de France. There were about twenty cyclists competing to see who could earn the most points -- by attending cycling class, of course. I earned the maillot jaune (the yellow jersey) with 23 points and my next closest competitor had 16 points (with the peloton far behind). But that's just part of the story.
For some time I've been coming to Life Time Fitness at Crosstown for yoga (in Josefina's class) and playing racquetball with my friend David. The cycling studio is right next to the racquetball courts and there's been a class on Saturdays at the same time we usually play. I told David that it looked like fun and he said, having tried it, that it is fun (and a big workout). In early June David got busy and then had an injury that has kept him off the court ever since. So one Saturday morning I decided to try cycling.
I borrowed a heart rate monitor (but had no idea what it was for) and tried to bike along in my regular gym shorts, shoes and a t-shirt. Despite being a cycling newbie I was immediately captivated by Alison's music and enthusiasm. She's dancing on her bike and you can't help but lock in the beat. Of course that's just after she tells you to dial up the resistance... and the sweat just pours out!
I admit that workout hit me pretty hard, but I had to come back and try the 5:45 am Wednesday EDGE cycle class (gulp). Despite what sounds like a crazy impossible time to get out and on a bike it actually works out super well. This plan requires one to up-level one's organization and after the workout I can assure you that you're fully awake and charged for the day!
Soon I invested in my own heart rate monitor. Then I realized it would work so much better if I had a metabolic assessment to tune my aerobic and anaerobic training zones. While I signed up for the assessment I decided to work with May as my personal trainer. In addition to helping me with my upper body (complementing the cycling) May is a nutritionist and has helped me dial in this critical facet of training. Even though I'm still working to tune my diet around my workouts, I've already learned a lot by using My Fitness Pal and, most importantly, I have a whole new attitude about food.
For the curious: the house nutritionist has been away in France for the month of July.
Soon I invested in bike shoes, jerseys and shorts and began to push myself into the proper zones during workouts and fuel my body properly afterwards. All these changes have led to dramatic weight loss \o/
A few of you know that the past two years have involved a lot of personal hardship. Upon reflection I have come to appreciate that the things in my life that I can actually control are a massive opportunity. I decided that fixing my exercise and nutrition were the opportunities I want to focus on. A note for my Debian friends... I'm sorry to have missed you in Cape Town, but I hope to join you in Montréal next year.
So when the Tour de Crosstown started in July I decided this was the time for me to get serious. I want to thank all the instructors for the great workouts (and for all the calories I've left on the bike): Alison, Kristine, Olivia, Tasha, and Caroline!
The results of my lifestyle changes are hard to describe... I feel an amazing amount of energy every day. The impact of a prior back injury is now almost non-existent. And the range of motion I hadn't recovered since being "washing-machined" by a 3-meter wave while body surfing at the beach in Hossegor the previous summer is now fully back.
Now I'm thinking it's time to treat myself to a new bike. I'm looking at large touring frames and am currently thinking of the Surly Disc Trucker. In terms of bike shops I've had a good experience with One on One, and Grand Performance has come highly recommended. If anyone has suggestions for bikes, bike features, or good shops please let me know!
I would encourage everyone here in Minneapolis to join me as a guest for a Wednesday morning 5:45am EDGE cycle class. I'm betting you'll have as much fun as I do... and I guarantee you will sweat! The challenge in waking up will pay off handsomely in making you energized for the whole day.
Let's bike – allons-y (let's go)!
The second Armadillo release of the 7.* series came out a few weeks ago: version 7.200.2. And RcppArmadillo version 0.7.200.2.0 is now on CRAN and uploaded to Debian. This followed the usual thorough reverse-dependency checking of the by now over 240 packages using it.
For once, I let it simmer a little, preparing only a package update via the GitHub repo without a CRAN upload, in order to lower the update frequency a little. Seeing that Conrad has started to release 7.300.0 tarballs, the time for a (final) 7.200.2 upload was right.
Just like the previous release, it requires a recent-enough compiler. As g++ is so common, we explicitly test for version 4.6 or newer. So if you happen to be on an older RHEL or CentOS release, you may need to get yourself a more modern compiler. The toolchain for R on Windows is now at g++ 4.9.3, which is a decent (yet stable) choice; the 4.8 series of g++ will also do. For reference, the current Ubuntu LTS ships g++ 5.4.0, and we have g++ 6.1 available in Debian testing.
This new upstream release adds new indexing helpers, additional return codes on some matrix transformations, increased speed for compound expressions via vectorise, corrects some LAPACK feature detections (affecting principally complex number use under OS X), and a rewritten sample() function thanks to James Balamuta.
Armadillo is a powerful and expressive C++ template library for linear algebra, aiming towards a good balance between speed and ease of use, with a syntax deliberately close to Matlab.
Changes in this release (and the preceding GitHub-only release 0.7.200.1.0) are as follows:

Changes in RcppArmadillo version 0.7.200.2.0 (2016-07-22)
- Upgraded to Armadillo release 7.200.2
- The sampling extension was rewritten to use Armadillo vector types instead of Rcpp types (PR #101 by James Balamuta)
- Upgraded to Armadillo release 7.200.1
  - added .index_min() and .index_max()
  - expanded ind2sub() to handle vectors of indices
  - expanded sub2ind() to handle matrices of subscripts
  - expanded expmat(), logmat() and sqrtmat() to optionally return a bool indicating success
  - faster handling of compound expressions by vectorise()
- The configure code now (once again) sets the values for the LAPACK feature #define correctly.
After many days of failed attempts, yesterday @Diego Roversi finally managed to setup SPI on the BeagleBone White¹, and that means that today at our home it was Laptop Liberation Day!
We took the spare X200, opened it, found the point we had reached in the tutorial on installing libreboot on the X200 (https://libreboot.org/docs/install/x200_external.html), connected all of the proper cables to the clip³ and did some reading tests of the original BIOS.
While the tutorial mentioned a very conservative setting (512 kHz), just for fun we tried reading it at different speeds; all results up to 16384 kHz were equal, with the first failure at 32784 kHz, so we settled on using 8192 kHz.
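For the curious, the speed probing can be done with flashrom's linux_spi programmer; a sketch of the kind of invocation involved (the spispeed parameter is in kHz, the device path is an example and depends on your wiring, and you should only read, never write, until repeated dumps match):

```
# read the flash twice at the chosen speed and compare checksums
flashrom -p linux_spi:dev=/dev/spidev1.0,spispeed=8192 -r dump1.rom
flashrom -p linux_spi:dev=/dev/spidev1.0,spispeed=8192 -r dump2.rom
sha1sum dump1.rom dump2.rom   # identical hashes suggest a trustworthy read
```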
Then it was time to customize our libreboot image with the right MAC address, and that's when we realized that the sheet of paper where we had written it down the last time had been put in a safe place… somewhere…
Luckily we also had taken a picture, and that was easier to find, so we checked the keyboard map², followed the instructions to customize the image https://libreboot.org/docs/hcl/gm45_remove_me.html#ich9gen, flashed the chip, partially reassembled the laptop, started it up and… a black screen, some fan noise and nothing else.
We tried to reflash the chip (nothing was changed), tried the us keyboard image, in case it was the better tested one (same results) and reflashed the original bios, just to check that the laptop was still working (it was).
It was lunchtime, so we stopped our attempts. As soon as we started eating, however, we realized that this laptop came with 3GB of RAM, and that surely meant "no matching pairs of RAM", so just after lunch we reflashed the first image, removed one DIMM, rebooted and finally saw a GNU-hugging penguin!
We then tried booting a random live USB key (https://tails.boum.org/) we had around (it failed the first time, but worked the second and subsequent times with no changes), and then proceeded to install Debian.
Running the installer required some attempts and a bit of duckduckgoing: parsing the isolinux / grub configurations from the libreboot menu didn't work, but in the end it was as easy as going to the command line and running:
From there on it was the usual Debian installation in a well-known environment, and there were no surprises. I've noticed that grub-coreboot is not installed (grub-pc is) and I want to investigate a bit, but rebooting worked out of the box with no issues.
Next step will be liberating my own X200 laptop, and then if you are around the @Gruppo Linux Como area and need a 16 pin clip let us know and we may bring everything to one of the LUG meetings⁴
¹ yes, white, and most of the instructions on the interwebz talk about the black, which is extremely similar to the white… except where it isn't
² wait? there are keyboard maps? doesn't everybody just use the us one regardless of what is printed on the keys? Do I *live* with somebody who doesn't? :D
³ the breadboard in the picture is only there for the power supply, the chip on it is a cheap SPI flash used to test SPI on the bone without risking the laptop :)
⁴ disclaimer: it worked for us. it may not work on *your* laptop. it may brick it. it may invoke a tentacled monster, it may bind your firstborn son to a life of servitude to some supernatural being. Whatever happens, it's not our fault.
Seems I've neglected both my blog and my RC bug fixing activities in the last months. Anyway, since I still keep track of the RC bugs I worked on, I thought I might as well publish the list:
- #798023 – src:cssutils: "cssutils: FTBFS with Python 3.5"
sponsor NMU by Chris Knadle, upload to DELAYED/2
- #800303 – src:libipc-signal-perl: "libipc-signal-perl: Please migrate to a supported debhelper compat level"
bump debhelper compat level, upload to DELAYED/5
- #808331 – src:libpgplot-perl: "libpgplot-perl: needs manual rebuild for Perl 5.22 transition"
manually build+upload packages (pkg-perl)
- #809056 – src:xca: "xca: FTBFS due to CPPFLAGS containing spaces"
sponsor NMU by Chris Knadle, upload to DELAYED/5
- #810017 – src:psi4: "psi4: FTBFS with perl 5.22"
propose a patch
- #810707 – src:libdbd-sqlite3-perl: "libdbd-sqlite3-perl: FTBFS: t/virtual_table/21_perldata_charinfo.t (Wstat: 512 Tests: 4 Failed: 1)"
investigate a bit (pkg-perl)
- #810710 – src:libdata-objectdriver-perl: "libdata-objectdriver-perl: FTBFS: t/02-basic.t (Wstat: 256 Tests: 67 Failed: 1)"
add patch to handle sqlite 3.10 in test suite's version comparison (pkg-perl)
- #810900 – libanyevent-rabbitmq-perl: "libanyevent-rabbitmq-perl: Can't locate object method "bind_exchange" via package "AnyEvent::RabbitMQ::Channel""
add info, versioned close (pkg-perl)
- #810910 – libmath-bigint-gmp-perl: "libmath-bigint-gmp-perl: FTBFS: test failures with newer libmath-bigint-perl"
upload new upstream release (pkg-perl)
- #810912 – libx11-xcb-perl: "libx11-xcb-perl: missing dependency on libxs-object-magic-perl"
update dependencies (pkg-perl)
- #813420 – src:libnet-server-mail-perl: "libnet-server-mail-perl: FTBFS: error: Can't call method "peerhost" on an undefined value at t/starttls.t line 78."
close, as the underlying problem is fixed (pkg-perl)
- #814730 – src:libmath-mpfr-perl: "libmath-mpfr-perl: FTBFS on most architectures"
upload new upstream release (pkg-perl)
- #815775 – zeroc-ice: "Build-Depends on unavailable packages mono-gmcs libmono2.0-cil"
sponsor NMU by Chris Knadle, upload to DELAYED/2
- #816527 – src:libtest-file-contents-perl: "libtest-file-contents-perl: FTBFS with Text::Diff 1.44"
upload new upstream release (pkg-perl)
- #816638 – mhonarc: "mhonarc: fails to run with perl5.22"
propose a patch in the BTS
- #817528 – src:libemail-foldertype-perl: "libemail-foldertype-perl: Removal of debhelper compat 4"
raise debhelper compat level, upload to DELAYED/2
- #817529 – src:libimage-base-bundle-perl: "libimage-base-bundle-perl: Removal of debhelper compat 4"
raise debhelper compat level, upload to DELAYED/2
- #817530 – src:liberror-perl: "liberror-perl: Removal of debhelper compat 4"
raise debhelper compat level, upload to DELAYED/2
- #817531 – src:libimage-info-perl: "libimage-info-perl: Removal of debhelper compat 4"
raise debhelper compat level, upload to DELAYED/2
- #817647 – src:randomplay: "randomplay: Removal of debhelper compat 4"
raise debhelper compat level, upload to DELAYED/2
- #818924 – libjson-webtoken-perl: "libjson-webtoken-perl: missing dependency on libmodule-runtime-perl"
add missing (build) dependency (pkg-perl)
- #819787 – libdbix-class-schema-loader-perl: "libdbix-class-schema-loader-perl: FTBFS: t/10_01sqlite_common.t failure"
close bug, works again with recent sqlite3 (pkg-perl)
- #821412 – libnet-rblclient-perl: "libnet-rblclient-perl: Net::DNS 1.01 breaks Net::RBLClient"
add patch (pkg-perl)
- #823310 – libnanomsg-raw-perl: "libnanomsg-raw-perl: FTBFS: test failures"
add patch from upstream git repo (pkg-perl)
- #824046 – src:libtkx-perl: "libtkx-perl: FTBFS: Tcl error 'Foo at /usr/lib/x86_64-linux-gnu/perl5/5.22/Tcl.pm line 585.\n' while invoking scalar result call"
first investigation (pkg-perl)
- #824143 – libperinci-sub-normalize-perl: "libperinci-sub-normalize-perl: FTBFS: Can't locate Sah/Schema/Rinci.pm in @INC"
upload new upstream version (pkg-perl)
- #825366 – src:libdist-zilla-plugin-ourpkgversion-perl: "libdist-zilla-plugin-ourpkgversion-perl: FTBFS: Can't locate Path/Class.pm in @INC"
add missing dependency (pkg-perl)
- #825424 – libdist-zilla-plugin-test-podspelling-perl: "libdist-zilla-plugin-test-podspelling-perl: FTBFS: Can't locate Path/Class.pm in @INC"
first triaging, forward upstream, import new release (pkg-perl)
- #825608 – libnet-jifty-perl: "libnet-jifty-perl: FTBFS: t/006-uploads.t failure"
triaging, mark unreproducible (pkg-perl)
- #825629 – src:libgd-perl: "libgd-perl: FTBFS: Could not find gdlib-config in the search path. "
first triaging, forward upstream (pkg-perl)
- #829064 – libparse-debianchangelog-perl: "libparse-debianchangelog-perl: FTBFS with new tidy version"
patch TML template (pkg-perl)
- #829066 – libparse-plainconfig-perl: "FTBFS: Can't modify constant item in scalar assignment"
new upstream release (pkg-perl)
- #829176 – src:libapache-htpasswd-perl: "libapache-htpasswd-perl: FTBFS: dh_clean: Please specify the compatibility level in debian/compat"
add debian/compat, upload to DELAYED/2
- #829409 – src:libhtml-tidy-perl: "libhtml-tidy-perl: FTBFS: Failed 7/21 test programs. 8/69 subtests failed."
apply patches from Simon McVittie (pkg-perl)
- #829668 – libparse-debianchangelog-perl: "libparse-debianchangelog-perl: FTBFS: Failed test 'Output of dpkg_str equal to output of dpkg-parsechangelog'"
add patch for compatibility with new dpkg (pkg-perl)
- #829746 – src:license-reconcile: "license-reconcile: FTBFS: Failed 7/30 test programs. 11/180 subtests failed."
versioned close, already fixed in latest upload (pkg-perl)
- #830275 – src:libgravatar-url-perl: "libgravatar-url-perl: accesses the internet during build"
skip test which needs internet access (pkg-perl)
- #830324 – src:libhttp-async-perl: "libhttp-async-perl: accesses the internet during build"
add patch to skip tests with DNS queries (pkg-perl)
- #830354 – src:libhttp-proxy-perl: "libhttp-proxy-perl: accesses the internet during build"
skip tests which need internet access (pkg-perl)
- #830355 – src:libanyevent-http-perl: "libanyevent-http-perl: accesses the internet during build"
skip tests which need internet access (pkg-perl)
- #830356 – src:libhttp-oai-perl: "libhttp-oai-perl: accesses the internet during build"
add patch to skip tests with DNS queries (pkg-perl)
- #830476 – src:libpoe-component-client-http-perl: "libpoe-component-client-http-perl: accesses the internet during build"
update existing patch (pkg-perl)
- #831233 – src:libmongodb-perl: "libmongodb-perl: build hangs with sbuild and libeatmydata"
lower severity (pkg-perl)
- #832361 – src:libmousex-getopt-perl: "libmousex-getopt-perl: FTBFS: Failed 2/22 test programs. 2/356 subtests failed."
upload new upstream release (pkg-perl)
Review: The Run of His Life, by Jeffrey Toobin

Publisher: Random House
Copyright: 1996, 1997
Printing: 2015
ISBN: 0-307-82916-2
Format: Kindle
Pages: 498
The O.J. Simpson trial needs little introduction to anyone who lived through it in the United States, but a brief summary for those who didn't.
O.J. Simpson is a Hall of Fame football player and one of the best running backs to ever play the game. He's also black, which is very relevant to much of what later happened. After he retired from professional play, he became a television football commentator and a spokesperson for various companies (particularly Hertz, a car rental business). In 1994, he was arrested for the murder of two people: his ex-wife, Nicole Brown Simpson, and Ron Goldman (a friend of Nicole's). The arrest happened after a bizarre low-speed police chase across Los Angeles in a white Bronco that was broadcast live on network television. The media turned the resulting criminal trial into a reality TV show, with live cable television broadcasts of all of the court proceedings. After nearly a full year of trial (with the jury sequestered for nine months — more on that later), a mostly black jury returned a verdict of not guilty after a mere four hours of deliberation.
Following the criminal trial, in an extremely unusual legal proceeding, Simpson was found civilly liable for Ron Goldman's death in a lawsuit brought by his family. Bizarre events surrounding the case continued long afterwards. A book titled If I Did It (with "if" in very tiny letters on the cover) was published, ghost-written but allegedly with Simpson's input and cooperation, and was widely considered a confession. Another legal judgment let the Goldman family get all the profits from that book's publication. In an unrelated (but also bizarre) incident in Las Vegas, Simpson was later arrested for kidnapping and armed robbery and is currently in prison until at least 2017.
It is almost impossible to have lived through the O.J. Simpson trial in the United States and not have formed some opinion on it. I was in college and without a TV set at the time, and even I watched some of the live trial coverage. Reactions to the trial were extremely racially polarized, as you might have guessed. A lot of black people believed at the time that Simpson was innocent (probably fewer now, given subsequent events). A lot of white people thought he was obviously guilty and was let off by a black jury for racial reasons. My personal opinion, prior to reading this book, was a common "third way" among white liberals: Simpson almost certainly committed the murders, but the racist Los Angeles police department decided to frame him for a crime that he did commit by trying to make the evidence stronger. That's a legitimate reason in the US justice system for finding someone innocent: the state has an obligation to follow correct procedure and treat the defendant fairly in order to get a conviction. I have a strong bias towards trusting juries; frequently, it seems that the media second-guesses the outcome of a case that makes perfect sense as soon as you see all the information the jury had (or didn't have).
Toobin's book changed my mind. Perhaps because I wasn't watching all of the coverage, I was greatly underestimating the level of incompetence and bad decision-making by everyone involved: the prosecution, the defense, the police, the jury, and the judge. This court case was a disaster from start to finish; no one involved comes away looking good. Simpson was clearly guilty given the evidence presented, but the case was so badly mishandled that it gave the jury room to reach the wrong verdict. (It's telling that, in the far better managed subsequent civil case, the jury had no trouble finding against Simpson.)
The Run of His Life is a very detailed examination of the entire Simpson case, from the night of the murder through the end of the trial and (in an epilogue) the civil case. Toobin was himself involved in the media firestorm, breaking some early news of the defense's decision to focus on race in The New Yorker and then serving throughout the trial as a legal analyst, and he makes it clear when he becomes part of the story. But despite that, this book felt objective to me. There are tons of direct quotes, lots of clear description of the evidence, underlying interviews with many of the people involved to source statements in the book, and a comprehensive approach to the facts. I think Toobin is a bit baffled by the black reaction to the case, and that felt like a gap in the comprehensiveness and the one place where he might be accused of falling back on stereotypes and easy judgments. But other than that hole, Toobin applies his criticism even-handedly and devastatingly to all parties.
I won't go into all the details of how Toobin changed my mind. It was a cumulative effect across the whole book, and if you're curious, I do recommend reading it. A lot of it was the detailed discussion of the forensic evidence, which was undermined for the jury at trial but looks very solid outside the hothouse of the case. But there is one critical piece that I would hope would be handled differently today, twenty years later, than it was by the prosecutors in that case: Simpson's history of domestic violence against Nicole. With what we now know about patterns of domestic abuse, the escalation to murder looks entirely unsurprising. And that history of domestic abuse was exceedingly well-documented: multiple external witnesses, police reports, and one actual prior conviction for spousal abuse (for which Simpson did "community service" that was basically a joke). The prosecution did a very poor job of establishing this history and the jury discounted it. That was a huge mistake by both parties.
I'll mention one other critical collection of facts that Toobin explains well and that contradicted my previous impression of the case: the relationship between Simpson and the police.
Today, in the era of Black Lives Matter, the routine abuse of black Americans by the police is more widely known. At the time of the murders, it was less recognized among white Americans, although black Americans certainly knew about it. But even in 1994, the Los Angeles police department was notorious as one of the most corrupt and racist big-city police departments in the United States. This is the police department that beat Rodney King. Mark Fuhrman, one of the police officers involved in the case (although not that significantly, despite his role at the trial), was clearly racist and had no business being a police officer. It was therefore entirely believable that these people would have decided to frame a black man for a murder he actually committed.
What Toobin argues, quite persuasively and with quite a lot of evidence, is that this analysis may make sense given the racial tensions in Los Angeles but ignores another critical characteristic of Los Angeles politics, namely a deference to celebrity. Prior to this trial, O.J. Simpson largely followed the path of many black athletes who become broadly popular in white America: underplaying race and focusing on his personal celebrity and connections. (Toobin records a quote from Simpson earlier in his life that perfectly captures this approach: "I'm not black, I'm O.J.") Simpson spent more time with white businessmen than the black inhabitants of central Los Angeles. And, more to the point, the police treated him as a celebrity, not as a black man.
Toobin takes some time to chronicle the remarkable history of deference and familiarity that the police showed Simpson. He regularly invited police officers to his house for parties. The police had a long history of largely ignoring or downplaying his abuse of his wife, including not arresting him in situations that clearly seemed to call for that, showing a remarkable amount of deference to his side of the story, not pursuing clear violations of the court judgment after his one conviction for spousal abuse, and never showing much inclination to believe or protect Nicole. Even on the night of the murder, they started following a standard playbook for giving a celebrity advance warning of investigations that might involve them before the news media found out about them. It seems clear, given the evidence that Toobin collected, that the racist Los Angeles police didn't focus that animus at Simpson, a wealthy celebrity living in Brentwood. He wasn't a black man in their eyes; he was a rich Hall of Fame football player and a friend.
This obviously raises the question of how the jury could return a not-guilty verdict. Toobin provides plenty of material to analyze that question from multiple angles in his detailed account of the case, but I can tell you my conclusion: Judge Lance Ito did a horrifically incompetent job of managing the case. He let the lawyers wander all over the case, interacted bizarrely with the media coverage (and was part of letting the media turn it into a daytime drama), was not crisp or clear about his standards of evidence and admissibility, and, perhaps worst of all, let the case meander on at incredible length, with a fully sequestered jury allowed only brief conjugal visits and no media contact (not even bookstore shopping!).
Quite a lot of anger was focused on the jury after the acquittal, and I do think they reached the wrong conclusion and had all the information they would have needed to reach the correct one. But Toobin touches on something that I think would be very hard to comprehend without having lived through it. The jury and alternate pool essentially lived in prison for nine months, with guards and very strict rules about contact with the outside world, in a country where compensation for jury duty is almost nonexistent. There were a lot of other factors behind their decision, including racial tensions and the sheer pressure from judging a celebrity case about which everyone has an opinion, but I think it's nearly impossible to overestimate the psychological tension and stress from being locked up with random other people under armed guard for three quarters of a year. It's hard for jury members to do an exhaustive and careful deliberation even in a typical trial that takes a week and doesn't involve sequestration. People want to get back to their lives and families. I can only imagine the state I would be in after nine months of this, or what poor psychological shape I would be in to make a careful and considered decision.
Similarly, for those who condemned the jury for profiting via books and media appearances after the trial, the current compensation for jurors is $15 per day (not hour). I believe at the time it was around $5 per day. There are a few employers who will pay full salary for the entire jury service, but only a few; many cap the length at a few weeks, and some employers treat all jury duty as unpaid leave. The only legal requirement for employers in the United States is that employees that serve on a jury have their job held for them to return to, but compensation is pathetic, not tied to minimum wage, and employers do not have to supplement it. I'm much less inclined to blame the jurors than the system that badly mistreated them.
As you can probably tell from the length of this review, I found The Run of His Life fascinating. If I had followed the whole saga more closely at the time, this may have been old material, but I think my vague impressions and patchwork of assumptions were more typical than not among people who were around for the trial but didn't invest a lot of effort into following it. If you are like me, and you have any interest in the case or the details of how the US criminal justice system works, this is a fascinating case study, and Toobin does a great job with it. Recommended.
Rating: 8 out of 10
I recently mentioned that I'd forked Antirez's editor and added Lua to it.
I've been working on it, on and off, for the past week or two now. It's finally reached a point where I'm content:
- The undo-support is improved.
- It has buffers, such that you can open multiple files and switch between them.
- This allows invocations such as "kilua *.txt" to work, for example.
- The syntax-highlighting is improved.
- We can now change the size of TAB-characters.
- We can now enable/disable highlighting of trailing whitespace.
- The default configuration-file is now embedded in the body of the editor, so you can run it portably.
- The keyboard input is better, allowing multi-character bindings.
- For example, bindings such as ^C, M-!, and ^X^C are now possible.
Most of the obvious things I use in Emacs are present, such as the ability to customize the status-bar (right now it shows the cursor position, the number of characters, the number of words, etc.).
Anyway I'll stop talking about it now :)
Here's the complete procedure I followed to replace a failed drive in a RAID array on a Debian machine.

Replace the failed drive
After seeing that /dev/sdb had been kicked out of my RAID array, I used smartmontools to identify the serial number of the drive to pull out:

smartctl -a /dev/sdb
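The serial number can also be picked out of that report automatically. A minimal sketch, assuming the usual "Serial Number:" line in smartmontools output; the helper name extract_serial is mine, not part of the original procedure:

```shell
# Filter the serial number field out of a smartctl report read from stdin.
extract_serial() {
    awk -F: '/^Serial Number/ { gsub(/^[ \t]+/, "", $2); print $2 }'
}

# Usage: smartctl -a /dev/sdb | extract_serial
```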
Armed with this information, I shut down the computer, pulled the bad drive out and put the new blank one in.

Initialize the new drive
After booting with the new blank drive in, I copied the partition table using parted.
First, I took a look at what the partition table looks like on the good drive:

$ parted /dev/sda unit s print

and created a new empty one on the replacement drive:

$ parted /dev/sdb unit s mktable gpt
then I ran mkpart for all 4 partitions and made them all the same size as the matching ones on /dev/sda.
Finally, I ran toggle 1 bios_grub (boot partition) and toggle X raid (where X is the partition number) for all RAID partitions, before verifying using print that the two partition tables were now the same.

Resync/recreate the RAID arrays
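Spelled out, the partitioning step looks roughly like the sketch below. The sector boundaries and partition names are hypothetical placeholders; the real Start/End values must be copied from the output of parted /dev/sda unit s print. The sketch only prints the commands (a dry run), so it is safe to inspect before swapping the echo for real execution:

```shell
#!/bin/sh
# Dry-run sketch: print the parted commands that would recreate the
# partition layout on the replacement drive.
DISK=/dev/sdb
run() { echo "would run: $*"; }   # replace echo with "$@" to actually execute

# Hypothetical sector boundaries -- copy the real ones from /dev/sda:
run parted -s "$DISK" unit s mkpart primary 2048s       4095s
run parted -s "$DISK" unit s mkpart primary 4096s       1052671s
run parted -s "$DISK" unit s mkpart primary 1052672s    9441279s
run parted -s "$DISK" unit s mkpart primary 9441280s    3907028991s

# Flag the boot partition and the RAID members:
run parted -s "$DISK" toggle 1 bios_grub
run parted -s "$DISK" toggle 2 raid
run parted -s "$DISK" toggle 3 raid
run parted -s "$DISK" toggle 4 raid

# Verify the result against the good drive:
run parted -s "$DISK" unit s print
```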
To sync the data from the good drive (/dev/sda) to the replacement one (/dev/sdb), I ran the following on my RAID1 partitions:

mdadm /dev/md0 -a /dev/sdb2
mdadm /dev/md2 -a /dev/sdb4

and kept an eye on the status of this sync using:

watch -n 2 cat /proc/mdstat

In order to speed up the sync, I used the following trick:

blockdev --setra 65536 "/dev/md0"
blockdev --setra 65536 "/dev/md2"
echo 300000 > /proc/sys/dev/raid/speed_limit_min
echo 1000000 > /proc/sys/dev/raid/speed_limit_max

Then, I recreated my RAID0 swap partition like this:

mdadm /dev/md1 --create --level=0 --raid-devices=2 /dev/sda3 /dev/sdb3
mkswap /dev/md1
Because the swap partition is brand new (you can't restore a RAID0, you need to re-create it), I had to update two things:
- replace the UUID for the swap mount in /etc/fstab, with the one returned by mkswap (or running blkid and looking for /dev/md1)
- replace the UUID for /dev/md1 in /etc/mdadm/mdadm.conf with the one returned for /dev/md1 by mdadm --detail --scan
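The fstab side of that update can be scripted. A minimal sketch, assuming GNU sed and a swap line of the common "UUID=... none swap ..." form; the helper name update_swap_uuid is mine, not part of the original procedure:

```shell
# Rewrite the UUID on the swap line of an fstab-style file in place.
update_swap_uuid() {
    fstab="$1"
    new_uuid="$2"
    sed -i -E \
        "s|^UUID=[0-9a-fA-F-]+([[:space:]]+none[[:space:]]+swap)|UUID=${new_uuid}\1|" \
        "$fstab"
}

# Usage (try it on a copy of /etc/fstab first to check the result):
#   update_swap_uuid /etc/fstab "$(blkid -o value -s UUID /dev/md1)"
```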
In order to be able to boot from both drives, I reinstalled the grub boot loader onto the replacement drive:

grub-install /dev/sdb
before rebooting with both drives to first make sure that my new config works.
Then I booted without /dev/sda to make sure that everything would be fine should that drive fail and leave me with just the new one (/dev/sdb).
This test obviously gets the two drives out of sync, so I rebooted with both drives plugged in and then had to re-add /dev/sda to the RAID1 arrays:

mdadm /dev/md0 -a /dev/sda2
mdadm /dev/md2 -a /dev/sda4

Once that finished, I rebooted again with both drives plugged in to confirm that everything is fine:

cat /proc/mdstat

Then I ran a full SMART test over the new replacement drive:

smartctl -t long /dev/sdb