Planet Python

Syndicate content
Planet Python -
Updated: 5 hours 33 min ago

Possibility and Probability: pip and private repositories: vendoring python

Wed, 2017-07-19 22:29

At work I am working on a project to migrate a series of Python apps into the cloud. Docker is a perfect fit for some of the apps, but one problem we ran into is getting our apps to build … Continue reading →

The post pip and private repositories: vendoring python appeared first on Possibility and Probability.

Categories: FLOSS Project Planets

Python Bytes: #35 How developers change programming languages over time

Wed, 2017-07-19 04:00
<p><strong>Brian #1:</strong> <a href=""><strong>Python Quirks</strong></a> <a href=""><strong>: Comments</strong></a></p> <ul> <li>Python developers put comments in their code.</li> </ul> <pre><code> # Like this """ And like this """ "And like this." ["Not usually like this","but it's possible"] </code></pre> <ul> <li>Philip Trauner timed all of these.</li> <li>Actual # comments are obviously way faster.</li> <li>He also shows the AST difference.</li> <li>Don’t abuse the language. Unused unreferenced strings are not free.</li> </ul> <p><strong>Michael #2:</strong> <a href=""><strong>Python 3.6.2 is out!</strong></a></p> <ul> <li><strong>Security</strong> <ul> <li>bpo-30730: Prevent environment variables injection in subprocess on Windows. Prevent passing other environment variables and command arguments.</li> <li>bpo-30694: Upgrade expat copy from 2.2.0 to 2.2.1 to get fixes of multiple security vulnerabilities including: CVE-2017-9233 (External entity infinite loop DoS), CVE-2016-9063 (Integer overflow, re-fix), CVE-2016-0718 (Fix regression bugs from 2.2.0’s fix to CVE-2016-0718) and CVE-2012-0876 (Counter hash flooding with SipHash). Note: the CVE-2016-5300 (Use os-specific entropy sources like getrandom) doesn’t impact Python, since Python already gets entropy from the OS to set the expat secret using XML_SetHashSalt().</li> <li>bpo-30500: Fix urllib.parse.splithost() to correctly parse fragments. For example, splithost('//') now correctly returns the host, instead of treating as the host in an authentification (login@host).</li> </ul></li> <li><strong>Core and Builtins</strong> <ul> <li>bpo-29104: Fixed parsing backslashes in f-strings.</li> <li>bpo-27945: Fixed various segfaults with dict when input collections are mutated during searching, inserting or comparing. 
Based on patches by Duane Griffin and Tim Mitchell.</li> <li>bpo-30039: If a KeyboardInterrupt happens when the interpreter is in the middle of resuming a chain of nested ‘yield from’ or ‘await’ calls, it’s now correctly delivered to the innermost frame.</li> <li>Library</li> <li>bpo-30038: Fix race condition between signal delivery and wakeup file descriptor. Patch by Nathaniel Smith.</li> <li>bpo-23894: lib2to3 now recognizes rb'...' and f'...' strings.</li> <li>bpo-24484: Avoid race condition in multiprocessing cleanup (#2159)</li> </ul></li> <li><strong>Windows</strong> <ul> <li>bpo-30687: Locate msbuild.exe on Windows when building rather than vcvarsall.bat</li> <li>bpo-30450: The build process on Windows no longer depends on Subversion, instead pulling external code from GitHub via a Python script. If Python 3.6 is not found on the system (via py -3.6), NuGet is used to download a copy of 32-bit Python.</li> </ul></li> <li><strong>Plus about 40 more fixes / changes</strong></li> </ul> <p><strong>Brian #3:</strong> <a href=""><strong>Contributing to Open Source Projects: Imposter Syndrome Disclaimer</strong></a></p> <ul> <li>“How to contribute” often part of OSS projects.</li> <li>Adrienne Lowe of has an “Imposter Syndrome Disclaimer” to include in your contributing documentation that’s pretty great.</li> <li>She’s also <a href="">collecting examples</a> of people using it, or similar.</li> <li>From the disclaimer: </li> </ul> <blockquote> <p>“<em>Imposter syndrome disclaimer</em>: I want your help. No really, I do. There might be a little voice inside that tells you you're not ready; that you need to do one more tutorial, or learn another framework, or write a few more blog posts before you can help me with this project. I assure you, that's not the case. … And you don't just have to write code. You can help out by writing documentation, tests, or even by giving feedback about this work. 
(And yes, that includes giving feedback about the contribution guidelines.)“</p> </blockquote> <p><strong>Michael #4:</strong> <a href=""><strong>The Dark Secret at the Heart of AI</strong></a></p> <ul> <li>via MIT Technology Review</li> <li>There’s a big problem with AI: even its creators can’t explain how it works</li> <li>Last year, an experimental vehicle, developed by researchers at the chip maker Nvidia, didn’t look different from other autonomous cars, but it was unlike anything demonstrated by Google, Tesla, or General Motors, and it showed the rising power of artificial intelligence. The car didn’t follow a single instruction provided by an engineer or programmer. Instead, it relied entirely on an algorithm that had taught itself to drive by watching a human do it.</li> <li>The result seems to match the responses you’d expect from a human driver. But what if one day it did something unexpected—crashed into a tree, or sat at a green light? </li> <li>As things stand now, it might be difficult to find out why.</li> <li>And you can’t ask it: there is no obvious way to design such a system so that it could always explain why it did what it did.</li> <li>There’s already an argument that being able to interrogate an AI system about how it reached its conclusions is a fundamental legal right</li> <li>We’ve never before built machines that operate in ways their creators don’t understand. 
How well can we expect to communicate—and get along with—intelligent machines that could be unpredictable and inscrutable</li> </ul> <p><strong>Brian #5:</strong> <a href=""><strong>Arrange Act Assert pattern for Python developers</strong></a></p> <ul> <li>James Cooke</li> <li>Good introduction to test case structure.</li> <li>Split your tests into setup, action, assertions.</li> <li>Pattern also known by: <ul> <li>Given, When, Then</li> <li>Setup, Test, Teardown</li> <li>Setup, Exercise, Verify, Teardown</li> </ul></li> <li>Also covered in: <ul> <li><a href=""></a></li> <li><a href=""></a></li> </ul></li> </ul> <p><strong>Michael #6:</strong> <a href=""><strong>Analyzing GitHub, how developers change programming languages over time</strong></a></p> <ul> <li>From source{d}: Building the first AI that understands code</li> <li>Have you ever been struggling with an nth obscure project, thinking : “I could do the job with this language but why not switch to another one which would be more enjoyable to work with” ?</li> <li>Derived from <a href=""><strong>The eigenvector of “Why we moved from language X to language Y”</strong></a><strong>,</strong> <a href=""><strong>Erik Bernhardsson</strong></a> <em>*</em>*</li> <li>Dataset available <ul> <li>4.5 Million GitHub users</li> <li>393 different languages</li> <li>10 TB of source code in total</li> </ul></li> <li>I find it nice to visualize developer’s language usage history with a kind of <a href=""><strong>Gantt diagram</strong></a>.</li> <li>We did not include Javascript because</li> <li>Most popular languages on GitHub</li> <li>At last! Here is the reward: the stationary distribution of our Markov chain. This probability distribution is independent of the initial distribution. It gives information about the stability of the process of random switching between languages. 
</li> <li><table> <thead> <tr> <th>Rank</th> <th>Language</th> <th>Popularity, %</th> <th>Source code, %</th> </tr> </thead> <tbody> <tr> <td>1.</td> <td>Python</td> <td>16.1</td> <td>11.3</td> </tr> <tr> <td>2.</td> <td>Java</td> <td>15.3</td> <td>16.6</td> </tr> <tr> <td>3.</td> <td>C</td> <td>9.2</td> <td>17.2</td> </tr> <tr> <td>4.</td> <td>C++</td> <td>9.1</td> <td>12.6</td> </tr> <tr> <td>5.</td> <td>PHP</td> <td>8.5</td> <td>24.4</td> </tr> <tr> <td>6.</td> <td>Ruby</td> <td>8.3</td> <td>2.6</td> </tr> <tr> <td>7.</td> <td>C#</td> <td>6.1</td> <td>6.5</td> </tr> </tbody> </table></li> <li><p>Python (16.1 %) appears to be the most attractive language, followed closely by Java (15.3 %). It’s especially interesting since only 11.3 % of all source code on GitHub is written in Python.</p></li> <li>Although there are ten times more lines of code on GitHub in PHP than in Ruby, they have the same stationary distribution.</li> <li>What about sticking to a language ? <ul> <li>Developers coding in one of the 5 most popular languages (Java, C, C++, PHP, Ruby) are most likely to switch to Python with approx. 22% chance on average.</li> <li>Similarly, a Visual Basic developer has more chance (24%) to move to C# while Erik’s is almost sure in this transition with 92% chance.</li> <li>People using numerical and statistical environments such as Fortran (36 %), Matlab (33 %) or R (40 %) are most likely to switch to Python in contrast to Erik’s matrix which predicts C as their future language.</li> </ul></li> </ul>
Categories: FLOSS Project Planets

Talk Python to Me: #121 Microservices in Python

Wed, 2017-07-19 04:00
Do you have big, monolithic web applications or services that are hard to manage, hard to change, and hard to scale? Maybe breaking them into microservices would give you many more options to evolve and grow that app. <br/> <br/> This week, we meet up again with Miguel Grinberg to discuss the trade-offs and advantages of microservices.<br/> <br/> Links from the show:<br/> <br/> <div style="font-size: .85em;"><b>Miguel on Twitter</b>: <a href="" target="_blank">@miguelgrinberg</a><br/> <b>Miguel's blog</b>: <a href="" target="_blank"></a><br/> <b>Microservices Tutorial at PyCon</b>: <a href="" target="_blank"></a><br/> <b>Flask Web Development (Amazon)</b>: <a href="" target="_blank"></a><br/> <b>Flask Web Development (O'Reilly)</b>: <a href="" target="_blank"></a><br/></div>
Categories: FLOSS Project Planets

Enthought: Webinar: Python for Scientists & Engineers: A Tour of Enthought’s Professional Training Course

Tue, 2017-07-18 18:20

When: Thursday, July 27, 2017, 11-11:45 AM CT (Live webcast, all registrants will receive a recording)

What:  A guided walkthrough and Q&A about Enthought’s technical training course Python for Scientists & Engineers with Enthought’s VP of Training Solutions, Dr. Michael Connell

Who Should Attend: individuals, team leaders, and learning & development coordinators who are looking to better understand the options to increase professional capabilities in Python for scientific and engineering applications

REGISTER  (if you can’t attend we’ll send all registrants a recording)

“Writing software is not my job…I just have to do it every day.”  
-21st Century Scientist or Engineer

Many scientists, engineers, and analysts today find themselves writing a lot of software in their day-to-day work even though that’s not their primary job and they were never formally trained for it. Of course, there is a lot more to writing software for scientific and analytic computing than just knowing which keyword to use and where to put the semicolon.

Software for science, engineering, and analysis has to solve the technical problem it was created to solve, of course, but it also has to be efficient, readable, maintainable, extensible, and usable by other people — including the original author six months later!

It has to be designed to prevent bugs and — because all reasonably complex software contains bugs — it should be designed so as to make the inevitable bugs quickly apparent, easy to diagnose, and easy to fix. In addition, such software often has to interface with legacy code libraries written in other languages like C or C++, and it may benefit from a graphical user interface to substantially streamline repeatable workflows and make the tools available to colleagues and other stakeholders who may not be comfortable working directly with the code for whatever reason.

Enthought’s Python for Scientists and Engineers is designed to accelerate the development of skill and confidence in addressing these kinds of technical challenges using some of Python’s core capabilities and tools, including:

  • The standard Python language
  • Core tools for science, engineering, and analysis, including NumPy (the fast array programming package), Matplotlib (for data visualization), and Pandas (for data analysis); and
  • Tools for crafting well-organized and robust code, debugging, profiling performance, interfacing with other languages like C and C++, and adding graphical user interfaces (GUIs) to your applications.

In this webinar, we give you the key information and insight you need to evaluate whether Enthought’s Python for Scientists and Engineers course is the right solution to take your technical skills to the next level, including:

  • Who will benefit most from the course
  • A guided tour through the course topics
  • What skills you’ll take away from the course, how the instructional design supports that
  • What the experience is like, and why it is different from other training alternatives (with a sneak peek at actual course materials)
  • What previous course attendees say about the course


Presenter: Dr. Michael Connell, VP, Enthought Training Solutions

Ed.D, Education, Harvard University
M.S., Electrical Engineering and Computer Science, MIT

Python for Scientists & Engineers Training: The Quick Start Approach to Turbocharging Your Work

If you are tired of running repeatable processes manually and want to (semi-) automate them to increase your throughput and decrease pilot error, or you want to spend less time debugging code and more time writing clean code in the first place, or you are simply tired of using a multitude of tools and languages for different parts of a task and want to replace them with one comprehensive language, then Enthought’s Python for Scientists and Engineers is definitely for you!

This class has been particularly appealing to people who have been using other tools like MATLAB or even Excel for their computational work and want to start applying their skills using the Python toolset.  And it’s no wonder — Python has been identified as the most popular coding language for five years in a row for good reason.

One reason for its broad popularity is its efficiency and ease-of-use. Many people consider Python more fun to work in than other languages (and we agree!). Another reason for its popularity among scientists, engineers, and analysts in particular is Python’s support for rapid application development and extensive (and growing) open source library of powerful tools for preparing, visualizing, analyzing, and modeling data as well as simulation.

Python is also an extraordinarily comprehensive toolset – it supports everything from interactive analysis to automation to software engineering to web app development within a single language and plays very well with other languages like C/C++ or FORTRAN so you can continue leveraging your existing code libraries written in those other languages.

Many organizations are moving to Python so they can consolidate all of their technical work streams under a single comprehensive toolset. In the first part of this class we’ll give you the fundamentals you need to switch from another language to Python and then we cover the core tools that will enable you to do in Python what you were doing with other tools, only faster and better!

Additional Resources

Upcoming Open Python for Scientists & Engineers Sessions:

Los Alamos, NM, Aug 14-18, 2017
Albuquerque, NM, Sept 11-15, 2017
Washington, DC, Sept 25-29, 2017
Los Alamos, NM, Oct 9-13, 2017
Cambridge, UK, Oct 16-20, 2017
San Diego, CA, Oct 30-Nov 3, 2017
Albuquerque, NM, Nov 13-17, 2017
Los Alamos, NM, Dec 4-8, 2017
Austin, TX, Dec 11-15, 2017

Have a group interested in training? We specialize in group and corporate training. Contact us or call 512.536.1057.

Learn More

Download Enthought’s Machine Learning with Python’s Scikit-Learn Cheat Sheets
Additional Webinars in the Training Series:

Python for Data Science: A Tour of Enthought’s Professional Technical Training Course

Python for Professionals: The Complete Guide to Enthought’s Technical Training Courses

An Exclusive Peek “Under the Hood” of Enthought Training and the Pandas Mastery Workshop

Download Enthought’s Pandas Cheat Sheets

The post Webinar: Python for Scientists & Engineers: A Tour of Enthought’s Professional Training Course appeared first on Enthought Blog.

Categories: FLOSS Project Planets

Anwesha Das: My Free Software journey, from today to yesteryear and back!

Tue, 2017-07-18 13:41

My life can be divided into two major parts.
Life before knowing “Libre Office” and after.
No, I won’t go on about how good the software is (which it is; fairly good).

I was fascinated by the name, the meaning.
Libre means “free, at liberty.”
The name suggested a philosophy, a vision, an idea to me.

Libre != Gratis

I was briefly introduced to the Free Software world a while back, in 2012, at a conference that altered the way I look at my soul-mate forever.

But the concepts of open source and free software, and the industry around them, did not quite settle clearly in my mind. The past leads us to the present, so I became more interested in the history of the movement to liberate cyberspace. I asked Kushal, who happens to be a free software advocate and a devout follower, to tell me the story and guide me through the path.

We embarked on a wonderfully meandering journey where I was visiting older, simpler times and Kushal was revisiting the movement with me. The era of sharing, the beginning of the complexities and the fight against it.

Our primary source of knowledge was the book, Hackers: Heroes of the Computer Revolution, apart from several old articles, newsletters etc.

I was astonished to know that there exists software that respects freedom; the freedom of end users.
There exists software that believes in cooperation and collective building.
By strict definition, the first one is known as free software and the second one is open source software.
To my utmost surprise, there are people sacrificing their material wants, peace of mind, and easy fame to help others; to preach freedom and openness.

How did it help me?

It made me understand the present state of the world, a lot better.
I now see the words community and collaboration in a different light altogether.
I now understand why people are so passionate about words, phrases or things.
Why these terms make all the difference.

Most importantly, it gave me a refuge. I may not be someone with a coding background, but I can still very much be a part of this community. Coding may be a big part of the technical community, but it’s not the only part. It gave me the power to ignore the cold shoulder and the occasional jibe (and to laugh at their incomprehension of history) whenever someone treated me like an outsider. It gave me the justification and confidence to quit a fairly established career as a mortgage attorney and start afresh at 30.

Community, collaboration, and freedom have been some of my most used words (apart from “No Py”!) each day since then.

And now I know why.
And it’s time for me to spread the word.

Coming to the present

In July 2017, a tweet and its reply made us relive that journey. The discussion was about the recent travel ban. A tweet came from @gnome about it, totally appropriate given the primary idea behind GNOME. But many could not figure out why such statements were coming from @gnome.
I was taken aback. We were sad, surprised, and somewhat angry. People were enjoying the fruits of so many people’s hard work and effort, and yet were ignorant of the history and the toil behind it all.
Among many other replies, there was a reply from Miguel de Icaza that seemed to echo our (Kushal’s and mine) thoughts.

We understand why he said that. We thought it was time for us, too, to do our bit to spread the word.
And so, we decided to do two things:

Firstly, to tell the newer, younger lot about it.
For this, Kushal & I have taken classes at the DGPLUG Summer Training session on:
- the history of hacker ethics
- licensing, and how the choices we make affect us, society, and the world at large

A similar session will follow at the PyLadies Pune August meetup.

Secondly, to write an article on the Hacker Ethic and the Free Software movement, together.
We published it on 12th July, 2017. Here is the link. Please read it (though it is a bit lengthy) and be a part of our journey.

Categories: FLOSS Project Planets

Mike Driscoll: Python: All About Decorators

Tue, 2017-07-18 13:15

Decorators can be a bit mind-bending when first encountered, and they can also be a bit tricky to debug. But they are a neat way to add functionality to functions and classes. A decorator is also known as a “higher-order function”. What this means is that it can take one or more functions as arguments and return a function as its result. In other words, a decorator will take the function it is decorating and extend its behavior while not actually modifying what the function itself does.

There have been two decorators in Python since version 2.2, namely classmethod() and staticmethod(). Then PEP 318 was put together and the decorator syntax was added to make decorating functions and methods possible in Python 2.4. Class decorators were proposed in PEP 3129 to be included in Python 2.6. They appear to work in Python 2.7, but the PEP indicates that they weren’t accepted until Python 3, so I’m not sure what happened there.

Let’s start off by talking about functions in general to get a foundation to work from.

The Humble Function

A function in Python, as in many other programming languages, is just a collection of reusable code. Some programmers will take an almost bash-like approach and just write all their code out in a file with no functions at all; the code just runs from top to bottom. This can lead to a lot of copy-and-paste spaghetti code. Whenever you see two pieces of code that are doing the same thing, they can almost always be put into a function. This will make updating your code easier since you’ll only have one place to update.

Here’s a basic function:

def doubler(number):
    return number * 2

This function accepts one argument: number. It multiplies it by 2 and returns the result. You can call the function like this:

>>> doubler(5)
10

As you can see, the result will be 10.

Functions are Objects Too

In Python, a lot of authors will describe a function as a “first-class object”. When they say this, they mean that a function can be passed around and used as an argument to other functions, just as you would with a normal data type such as an integer or string. Let’s look at a few examples so we can get used to the idea:

>>> def doubler(number):
...     return number * 2
...
>>> print(doubler)
<function doubler at 0x7f7886b92f50>
>>> print(doubler(10))
20
>>> doubler.__name__
'doubler'
>>> doubler.__doc__
None
>>> def doubler(number):
...     """Doubles the number passed to it"""
...     return number * 2
...
>>> doubler.__doc__
'Doubles the number passed to it'
>>> dir(doubler)
['__call__', '__class__', '__closure__', '__code__', '__defaults__', '__delattr__', '__dict__', '__doc__', '__format__', '__get__', '__getattribute__', '__globals__', '__hash__', '__init__', '__module__', '__name__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', 'func_closure', 'func_code', 'func_defaults', 'func_dict', 'func_doc', 'func_globals', 'func_name']

As you can see, you can create a function and then pass it to Python’s print() function or any other function. You will also note that once a function is defined, it automatically has attributes that we can access. For example, in the example above, we accessed __doc__, which was empty at first. This attribute holds the contents of the function’s docstring. Since we didn’t have a docstring, it returned None. So we redefined the function to add a docstring and accessed __doc__ again to see the docstring. We can also get the function’s name via the __name__ attribute. Feel free to check out some of the other attributes that are shown in the last example above.

Our First Decorator

Creating a decorator is actually quite easy. As mentioned earlier, all you need to do to create a decorator is to create a function that accepts another function as its argument. Let’s take a look:

>>> def doubler(number):
...     """Doubles the number passed to it"""
...     return number * 2
...
>>> def info(func):
...     def wrapper(*args):
...         print('Function name: ' + func.__name__)
...         print('Function docstring: ' + str(func.__doc__))
...         return func(*args)
...     return wrapper
...
>>> my_decorator = info(doubler)
>>> print(my_decorator(2))
Function name: doubler
Function docstring: Doubles the number passed to it
4

You will note that our decorator function, info(), has a function nested inside of it, called wrapper(). You can call the nested function whatever you like. The wrapper function accepts the arguments (and optionally the keyword arguments) of the function you are wrapping with your decorator. In this example, we print out the wrapped function’s name and docstring, if it exists. Then we call the wrapped function with its arguments and return the result. Lastly, we return the wrapper function.

To use the decorator, we create a decorator object:

>>> my_decorator = info(doubler)

Then to call the decorator, we call it just like we would a normal function: my_decorator(2).

However this is not the usual method of calling a decorator. Python has a special syntax just for that!

Using Decorator Syntax

Python allows you to call a decorator by using the following syntax: @info. Let’s update our previous example to use proper decorator syntax:

def info(func):
    def wrapper(*args):
        print('Function name: ' + func.__name__)
        print('Function docstring: ' + str(func.__doc__))
        return func(*args)
    return wrapper

@info
def doubler(number):
    """Doubles the number passed to it"""
    return number * 2

print(doubler(4))

Now you can call doubler() itself instead of calling the decorator object. The @info above the function definition tells Python to automatically wrap (or decorate) the function and call the decorator when the function is called.
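To make the sugar concrete, here is a small sketch showing that the @ syntax is just shorthand for rebinding the function name by hand (the info() decorator is the same one used above):

```python
def info(func):
    def wrapper(*args):
        print('Function name: ' + func.__name__)
        return func(*args)
    return wrapper

def doubler(number):
    return number * 2

# Writing @info above the def is shorthand for this manual rebinding:
doubler = info(doubler)

print(doubler(4))  # prints the function name, then 8
```

After the rebinding, the name doubler actually refers to the wrapper function, which is exactly what the @ syntax produces.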

Stacked Decorators

You can also stack or chain decorators. What this means is that you can use more than one decorator on a function at the same time! Let’s take a look at a silly example:

def bold(func):
    def wrapper():
        return "<b>" + func() + "</b>"
    return wrapper

def italic(func):
    def wrapper():
        return "<i>" + func() + "</i>"
    return wrapper

@bold
@italic
def formatted_text():
    return 'Python rocks!'

print(formatted_text())

The bold() decorator will wrap the text with your standard bold HTML tags, while the italic() decorator does the same thing but with italic HTML tags. You should try reversing the order of the decorators to see what kind of effect it has. Give it a try before continuing.

Now that you’ve done that, you will have noticed that Python runs the decorator closest to the function first and then works up the chain. So in the version of the code above, the text will get wrapped in italics first and then that text will get wrapped in bold tags. If you swap them, then the reverse will occur.
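If you would rather see both orderings side by side, here is a self-contained sketch (the defs are repeated so it runs on its own) that applies the same two decorators in each order:

```python
def bold(func):
    def wrapper():
        return "<b>" + func() + "</b>"
    return wrapper

def italic(func):
    def wrapper():
        return "<i>" + func() + "</i>"
    return wrapper

# Decorators apply bottom-up: the one closest to the def runs first.
@bold
@italic
def formatted_text():
    return 'Python rocks!'

@italic
@bold
def swapped_text():
    return 'Python rocks!'

print(formatted_text())  # <b><i>Python rocks!</i></b>
print(swapped_text())    # <i><b>Python rocks!</b></i>
```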

Adding Arguments to Decorators

Adding arguments to decorators is a bit different than you might think it is. You can’t just do something like @my_decorator(3, 'Python'), as the decorator expects to take the function itself as its argument… or can you?

def info(arg1, arg2):
    print('Decorator arg1 = ' + str(arg1))
    print('Decorator arg2 = ' + str(arg2))

    def the_real_decorator(function):

        def wrapper(*args, **kwargs):
            print('Function {} args: {} kwargs: {}'.format(
                function.__name__, str(args), str(kwargs)))
            return function(*args, **kwargs)

        return wrapper

    return the_real_decorator

@info(3, 'Python')
def doubler(number):
    return number * 2

print(doubler(5))

As you can see, we have a function nested in a function nested in a function! How does this work? The function argument doesn’t even seem to be defined anywhere. Let’s remove the decorator and do what we did before when we created the decorator object:

def info(arg1, arg2):
    print('Decorator arg1 = ' + str(arg1))
    print('Decorator arg2 = ' + str(arg2))

    def the_real_decorator(function):

        def wrapper(*args, **kwargs):
            print('Function {} args: {} kwargs: {}'.format(
                function.__name__, str(args), str(kwargs)))
            return function(*args, **kwargs)

        return wrapper

    return the_real_decorator

def doubler(number):
    return number * 2

decorator = info(3, 'Python')(doubler)
print(decorator(5))

This code is the equivalent of the previous code. When you call info(3, 'Python'), it returns the actual decorator function, which we then call by passing it the function, doubler. This gives us the decorator object itself, which we can then call with the original function’s arguments. We can break this down further though:

def info(arg1, arg2):
    print('Decorator arg1 = ' + str(arg1))
    print('Decorator arg2 = ' + str(arg2))

    def the_real_decorator(function):

        def wrapper(*args, **kwargs):
            print('Function {} args: {} kwargs: {}'.format(
                function.__name__, str(args), str(kwargs)))
            return function(*args, **kwargs)

        return wrapper

    return the_real_decorator

def doubler(number):
    return number * 2

decorator_function = info(3, 'Python')
print(decorator_function)

actual_decorator = decorator_function(doubler)
print(actual_decorator)

# Call the decorated function
print(actual_decorator(5))

Here we show that we get the decorator function object first. Then we get the decorator object which is the first nested function in info(), namely the_real_decorator(). This is where you want to pass the function that is being decorated. Now we have the decorated function, so the last line is to call the decorated function.

I also found a neat trick you can do with Python’s functools module that will make creating decorators with arguments a bit shorter:

from functools import partial

def info(func, arg1, arg2):
    print('Decorator arg1 = ' + str(arg1))
    print('Decorator arg2 = ' + str(arg2))

    def wrapper(*args, **kwargs):
        print('Function {} args: {} kwargs: {}'.format(
            func.__name__, str(args), str(kwargs)))
        return func(*args, **kwargs)

    return wrapper

decorator_with_arguments = partial(info, arg1=3, arg2='Py')

@decorator_with_arguments
def doubler(number):
    return number * 2

print(doubler(5))

In this case, you can create a partial function that takes the arguments you are going to pass to your decorator for you. This allows you to pass the function to be decorated AND the arguments to the decorator to the same function. This is actually quite similar to how you can use functools.partial for passing extra arguments to event handlers in wxPython or Tkinter.

Class Decorators

When you look up the term “class decorator”, you will find a mix of articles. Some talk about creating decorators using a class. Others talk about decorating a class with a function. Let’s start with creating a class that we can use as a decorator:

class decorator_with_arguments:

    def __init__(self, arg1, arg2):
        print('in __init__')
        self.arg1 = arg1
        self.arg2 = arg2
        print('Decorator args: {}, {}'.format(arg1, arg2))

    def __call__(self, f):
        print('in __call__')
        def wrapped(*args, **kwargs):
            print('in wrapped()')
            return f(*args, **kwargs)
        return wrapped

@decorator_with_arguments(3, 'Python')
def doubler(number):
    return number * 2

print(doubler(5))

Here we have a simple class that accepts two arguments. We override the __call__() method, which allows us to pass the function we are decorating to the class instance. Then in our __call__() method, we just print out where we are in the code and return the wrapped function. This works in much the same way as the examples in the previous section. I personally like this method because we don’t have functions nested two levels deep inside another function, although some could argue that the partial example also fixed that issue.

Anyway the other use case that you will commonly find for a class decorator is a type of meta-programming. So let’s say we have the following class:

class MyActualClass:
    def __init__(self):
        print('in MyActualClass __init__()')

    def quad(self, value):
        return value * 4


obj = MyActualClass()
print(obj.quad(4))

That’s pretty simple, right? Now let’s say we want to add special functionality to our class without modifying what it already does. For example, this might be code that we can’t change for backwards compatibility reasons or some other business requirement. Instead, we can decorate it to extend its functionality. Here’s how we can add a new method, for example:

def decorator(cls):
    class Wrapper(cls):
        def doubler(self, value):
            return value * 2
    return Wrapper


@decorator
class MyActualClass:
    def __init__(self):
        print('in MyActualClass __init__()')

    def quad(self, value):
        return value * 4


obj = MyActualClass()
print(obj.quad(4))
print(obj.doubler(5))

Here we created a decorator function that has a class inside of it. This class will use the class that is passed to it as its parent. In other words, we are creating a subclass. This allows us to add new methods. In this case, we add our doubler() method. Now when you create an instance of the decorated MyActualClass() class, you will actually end up with the Wrapper() subclass version. You can actually see this if you print the obj variable.

Wrapping Up

Python has a lot of decorator functionality built into the language itself. There are @property, @classmethod, and @staticmethod that you can use directly. Then there are the functools and contextlib modules, which provide a lot of handy decorators. For example, you can fix decorator obfuscation using functools.wraps or make any function a context manager via contextlib.contextmanager.
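Those two standard-library helpers can be sketched briefly; the logged and announce names below are my own illustrative choices, not from the article:

```python
from contextlib import contextmanager
from functools import wraps


def logged(func):
    @wraps(func)  # copies func's __name__ and __doc__ onto the wrapper
    def wrapper(*args, **kwargs):
        print('calling ' + func.__name__)
        return func(*args, **kwargs)
    return wrapper


@logged
def doubler(number):
    """Double the given number."""
    return number * 2


@contextmanager
def announce(label):
    # Everything before the yield runs on entry, everything after on exit.
    print('entering ' + label)
    yield label
    print('leaving ' + label)


print(doubler.__name__)   # 'doubler', not 'wrapper', thanks to @wraps
with announce('demo'):
    print(doubler(5))     # 10
```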

A lot of developers use decorators to enhance their code by creating logging decorators, catching exceptions, adding security and so much more. They are worth the time to learn as they can make your code more extensible and even more readable. Decorators also promote code reuse. Give them a try sometime soon!

Related Reading

Also covered: class decorators, functools.wraps, contextlib, and decorators with arguments.

Python 201: Decorating the main function

Categories: FLOSS Project Planets

PyCharm: Webinar: “Teaching Python 3.6 with Games” with Paul Craven, August 2nd

Tue, 2017-07-18 05:14

Want to teach Python? Want to use Python 3 type hinting in your applications? Want to write games? In this webinar, we look at all three, together, using the Arcade library for Python and its creator, Paul Craven.

  • Wednesday, August 2nd
  • 17:00 European Time, 11AM Eastern Daylight Time
  • Register here

Paul Craven is the creator of Arcade, a 2d game library, and author of Program Arcade Games with Python and Pygame. He’s also a college professor teaching beginning programming using Python and game-writing.

He has settled on Arcade, Python 3.6 and PyCharm CE in his instruction. Arcade is written for the purpose of teaching, and thus heavily uses Python 3.6 type hinting. In this webinar, Paul will discuss:

  • Teaching programming using Python and game-writing
  • 2d games, and how Arcade was designed to help teach students
  • How Python 3.6 type hinting, combined with an IDE, helps both teaching and writing a framework
Speaking to you

Paul Vincent Craven (@professorcraven) is head of the Computer Science Department at Simpson College in Indianola, Iowa. He has a Ph.D. in computer science from the University of Idaho, and Masters from Missouri S&T. He worked for several years in IT before becoming a full-time professor. He finds that teaching students to program video games is a lot more fun than writing software to track mortgages.

Categories: FLOSS Project Planets

S. Lott: Yet Another Python Problem List

Tue, 2017-07-18 04:00
This was a cool thing to see in my Twitter feed:

Dan Bader (@dbader_org), 6/19/17, 10:47 PM: “Why Python Is Not My Favorite Language”…
More Problems with Python. Here's the short list.

1. Encapsulation (Inheritance, really.)
2. With Statement
3. Decorators
4. Duck Typing (and Documentation)
5. Types

I like these kinds of posts because they surface problems that are way, way out at the fringes of Python. What's important to me is that most of the language is fine, but the syntaxes for a few things are sometimes irksome. Also important to me is that it's almost never the deeper semantics; it seems to be entirely a matter of syntax.

The really big problem is people who take the presence of a list like this as a reason to dismiss Python in its entirety because they found a few blog posts identifying specific enhancements. That "Python must be bad because people are proposing improvements" attitude is maddening. And dismayingly common.

Even in a Python-heavy workplace, there are Java and Node.js people who have opinions shaped by little lists like these. The "semantic whitespace" argument coming from JavaScript people is ludicrous, but there they are: JavaScript has a murky relationship with semi-colons and they're complaining about whitespace. Minifying isn't a virtue. It's a hack. Really.

My point in general is not to say this list is wrong. It's to say that these points are minor. In many cases, I don't disagree that these can be seen as problems. But I don't think they're toweringly important.

1. The body of the first point seems to be more about inheritance and accidentally overriding something that shouldn't have been overridden. Java (and C++) folks like to use private for this. Python lets you read the source. I vote for reading the source.

2. Yep. There are other ways to do this. Clever approach. I still prefer with statements.

3. I'm not sold on the syntax change being super helpful.

4. People write bad documentation about their duck types. Good point. People need to be more clear.

5. Agree. A lot of projects need to implement type hints to make it more useful.
Categories: FLOSS Project Planets

Glyph Lefkowitz: Beyond ThunderDock

Tue, 2017-07-18 03:11

This weekend I found myself pleased to receive a Kensington SD5000T Thunderbolt 3 Docking Station.

Some of its functionality was a bit of a weird surprise.

The Setup

Due to my ... accretive history with computer purchases, I have 3 things on my desk at home: a USB-C macbook pro, a 27" Thunderbolt iMac, and an older 27" Dell display, which is old enough at this point that I can’t link it to you. Please do not take this to be some kind of totally sweet setup. It would just be somewhat pointlessly expensive to replace this jumble with something nicer. I purchased the dock because I want to have one cable to connect me to power & both displays.

For those not familiar, iMacs of a certain vintage [1] can be jury-rigged to behave as Thunderbolt displays with limited functionality (no access from the guest system to the iMac’s ethernet port, for example), using Target Display Mode, which extends their useful lifespan somewhat. (This machine is still, relatively speaking, a powerhouse, so it’s not quite dead yet; but it’s nice to be able to swap in my laptop and use the big screen.)

The Link-up

On the back of the Thunderbolt dock, there are 2 Thunderbolt 3 ports. I plugged the first one into a Thunderbolt 3 to Thunderbolt 2 adapter which connects to the back of the iMac, and the second one into the Macbook directly. The Dell display plugs into the DisplayPort; I connected my network to the Ethernet port of the dock. My mouse, keyboard, and iPhone were plugged into the USB ports on the dock.

The Problem

I set it up and at first it seemed to be delivering on the “one cable” promise of thunderbolt 3. But then I switched WiFi off to test the speed of the wired network and was surprised to see that it didn’t see the dock’s ethernet port at all. Flipping wifi back on, I looked over at my router’s control panel and noticed that a new device (with the expected manufacturer) was on my network. nmap seemed to indicate that it was... running exactly the network services I expected to see on my iMac. VNCing into the iMac to see what was going on, I popped open the Network system preference pane, and right there alongside all the other devices, was the thunderbolt dock’s ethernet device.

The Punch Line

Despite the miasma of confusion surrounding USB-C and Thunderbolt 3 [2], the surprise here is that apparently Thunderbolt is Thunderbolt, and (for this device at least) Thunderbolt devices connected across the same bus can happily drive whatever they’re plugged in to. The Thunderbolt 2 to 3 adapter isn’t just a fancy way of plugging in hard drives and displays with the older connector; as far as I can tell all the functionality of the Thunderbolt interface remains intact as both “host” and “guest”. It’s like having an ethernet switch for your PCI bus.

What this meant is that when I unplugged everything and then carefully plugged in the iMac before the Macbook, it happily lit up the Dell display, and connected to all the USB devices plugged into the USB hub. When I plugged the laptop in, it happily started charging, but since it didn’t “own” the other devices, nothing else connected to it.


This dock works a little bit too well; when I “dock” now I have to carefully plug in the laptop first, give it a moment to grab all the devices so that it “owns” them, then plug in the iMac, then use this handy app to tell the iMac to enter Target Display mode.

On the other hand, this does also mean that I can quickly toggle between “everything is plugged in to the iMac” and “everything is plugged in to the MacBook” just by disconnecting and reconnecting a single cable, which is pretty neat.

  1. Sadly, not the most recent fancy 5K ones. 

  2. which are, simultaneously, both the same thing and not the same thing. 

Categories: FLOSS Project Planets

Curtis Miller: Let’s Create Our Own Cryptocurrency

Mon, 2017-07-17 21:27
Originally posted on
I’ve been itching to build my own cryptocurrency… and I shall give it an unoriginal & narcissistic name: Cranky Coin. After giving it a lot of thought, I decided to use Python. GIL thread concurrency is sufficient. Mining might suffer, but can be replaced with a C mining module. Most…
Categories: FLOSS Project Planets

Daniel Bader: How to Check if a File Exists in Python

Mon, 2017-07-17 20:00
How to Check if a File Exists in Python

A tutorial on how to find out whether a file (or directory) exists using Python built-ins and functions from the standard library.

The ability to check whether a file exists on disk or not is important for many types of Python programs:

Maybe you want to make sure a data file is available before you try to load it, or maybe you want to prevent overwriting an existing file. The same is true for directories—maybe you need to ensure an output folder is available before your program runs.

In Python, there are several ways to verify a file or directory exists using functions built into the core language and the Python standard library.

In this tutorial you’ll see three different techniques for file existence checks in Python, with code examples and their individual pros and cons.

Let’s take a look!

Option #1: os.path.exists() and os.path.isfile()

The most common way to check for the existence of a file in Python is using the exists() and isfile() methods from the os.path module in the standard library.

These functions are available on Python 2 and 3, and they’re usually the first suggestion that comes up when you consult the Python docs or a search engine on how to solve this problem.

Here’s a demo of how to work with the os.path.exists() function. I’m checking several paths (files and directories) for existence in the example below:

>>> import os.path
>>> os.path.exists('mydirectory/myfile.txt')
True
>>> os.path.exists('does-not-exist.txt')
False
>>> os.path.exists('mydirectory')
True

As you just saw, calling os.path.exists() will return True for files and directories. If you want to ensure that a given path points to a file and not to a directory, you can use the os.path.isfile() function:

>>> import os.path
>>> os.path.isfile('mydirectory/myfile.txt')
True
>>> os.path.isfile('does-not-exist.txt')
False
>>> os.path.isfile('mydirectory')
False

With both functions it’s important to keep in mind that they will only check if a file exists—and not if the program actually has access to it. If verifying access is important then you should consider simply opening the file while looking out for an I/O exception (IOError) to be raised.

We’ll come back to this technique in the summary at the end of the tutorial. But before we do that, let’s take a look at another option for doing file existence checks in Python.

Option #2: open() and try...except

You just saw how functions in the os.path module can be used to check for the existence of a file or a folder.

Here’s another straightforward Python algorithm for checking whether a file exists: You simply attempt to open the file with the built-in open() function, like so:

>>> open('does-not-exist.txt')
FileNotFoundError: "[Errno 2] No such file or directory: 'does-not-exist.txt'"

If the file exists the open call will complete successfully and return a valid file handle. If the file does not exist however, a FileNotFoundError exception will be raised:

“FileNotFoundError is raised when a file or directory is requested but doesn’t exist. Corresponds to errno ENOENT.” (Source: Python Docs)

This means you can watch for this FileNotFoundError exception type in your own code, and use it to detect whether a file exists or not. Here’s a code example that demonstrates this technique:

try:
    f = open('myfile.txt')
    f.close()
except FileNotFoundError:
    print('File does not exist')

Notice how I’m immediately calling the close() method on the file object to release the underlying file handle. This is generally considered a good practice when working with files in Python:

If you don’t close the file handle explicitly it is difficult to know when exactly it will be closed automatically by the Python runtime. This wastes system resources and can make your programs run less efficiently.

Instead of closing the file explicitly with the close() method, another option here would be to use the context manager protocol and the with statement to auto-close the file.
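As a minimal sketch of that variant (using the same hypothetical myfile.txt as above), the with statement releases the handle automatically, even if an exception is raised inside the block:

```python
try:
    with open('myfile.txt') as f:  # closed automatically when the block exits
        data = f.read()
except FileNotFoundError:
    print('File does not exist')
```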

Now, the same “just attempt to open it” technique also works for ensuring a file is both readable and accessible. Instead of watching for FileNotFoundError exceptions you’ll want to look out for any kind of IOError:

try:
    f = open('myfile.txt')
    f.close()
except IOError:
    print('File is not accessible')
else:
    print('File is accessible')

If you frequently use this pattern you can factor it out into a helper function that will allow you to test whether a file exists and is accessible at the same time:

def is_accessible(path, mode='r'):
    """
    Check if the file or directory at `path` can
    be accessed by the program using `mode` open flags.
    """
    try:
        f = open(path, mode)
        f.close()
    except IOError:
        return False
    return True

Alternatively, you can use the os.access() function in the standard library to check whether a file exists and is accessible at the same time. This would be more similar to using the os.path.exists() function for checking if a file exists.
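Here is a short sketch of os.access(), checking existence and individual permission bits in one call (myfile.txt is hypothetical, as in the earlier examples):

```python
import os

print(os.access('myfile.txt', os.F_OK))  # does the path exist at all?
print(os.access('myfile.txt', os.R_OK))  # is it readable?
print(os.access('myfile.txt', os.W_OK))  # is it writable?
```

One caveat worth knowing: os.access() answers based on the real user and group IDs, which can differ from the effective IDs that an actual open() call uses.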

Using open() and a try...except clause has some advantages when it comes to file handling in Python. It can help you avoid bugs caused by file existence race conditions:

Imagine a file exists in the instant you run the check, only to get removed a millisecond later. When you actually want to open the file to work with it, it’s gone and your program aborts with an error.

I’ll cover this edge case in some more detail in the summary below. But before we get down another rabbit hole—let’s take a look at one more option for checking if a file or folder exists in Python.

Option #3: pathlib.Path.exists() (Python 3.4+)

Python 3.4 and above include the pathlib module that provides an object-oriented interface for dealing with file system paths. Using this module is much nicer than treating file paths as simple string objects.

It provides abstractions and helper functions for many file system operations, including existence checks and finding out whether a path points to a file or a directory.

To check whether a path points to a valid file you can use the Path.exists() method. To find out whether a path points to a file (or a symbolic link to one), rather than a directory, you’ll want to use Path.is_file().

Here’s a working example for both pathlib.Path methods:

>>> import pathlib
>>> path = pathlib.Path('myfile.txt')
>>> path.exists()
True
>>> path.is_file()
True

As you can tell, this approach is very similar to doing an existence check with functions from the os.path module.

The key difference is that pathlib provides a cleaner object-oriented interface for working with the file system. You’re no longer dealing with plain str objects representing file paths—but instead you’re handling Path objects with relevant methods and attributes on them.

Using pathlib and taking advantage of its object-oriented interface can make your file handling code more readable and more maintainable. I’m not going to lie to you and say this is a panacea. But in some cases it can help you write “better” Python programs.

The pathlib module is also available as a backported third-party module on PyPI that works on Python 2.x and 3.x. You can find it here: pathlib2

Summary: Checking if a File Exists in Python

In this tutorial we compared three different methods for determining whether a file exists in Python. One method also allowed us to check if a file exists and is accessible at the same time.

Of course, with three implementations to choose from you might be wondering:

What’s the preferred way to check if a file exists using Python?

In most cases where you need a file existence check I’d recommend you use the built-in pathlib.Path.exists() method on Python 3.4 and above, or the os.path.exists() function on Python 2.

However, there’s one important caveat to consider:

Keep in mind that just because a file existed when the check ran doesn’t guarantee that it will still be there when you’re ready to open it:

While unlikely under normal circumstances, it’s entirely possible for a file to exist in the instant the existence check runs, only to get deleted immediately afterwards.

To avoid this type of race condition, it helps to not only rely on a “Does this file exist?” check. Instead it’s usually better to simply attempt to carry out the desired operation right away. This is also called an “easier to ask for forgiveness than permission” (EAFP) style that’s usually recommended in Python.

For example, instead of checking first if a file exists before opening it, you’ll want to simply try to open it right away and be prepared to catch a FileNotFoundError exception that tells you the file wasn’t available. This avoids the race condition.

So, if you plan on working with a file immediately afterwards, for example by reading its contents or by appending new data to it, I would recommend that you do the existence check via the open() method and exception handling in an EAFP style. This will help you avoid race conditions in your Python file handling code.
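For the appending case, a minimal EAFP sketch might look like this (data.txt is a hypothetical name; 'r+' is chosen because, unlike 'a', it raises FileNotFoundError when the file is missing):

```python
try:
    with open('data.txt', 'r+') as f:  # 'r+' fails if the file is missing
        f.seek(0, 2)                   # move to the end of the file
        f.write('appended line\n')
except FileNotFoundError:
    print('data.txt was not available')
```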

If you’d like to dig deeper into the subject, be sure to watch my YouTube tutorial on file existence checks in Python. It’s also embedded at the top of the article. Happy Pythoning!

Categories: FLOSS Project Planets

Matthew Rocklin: Scikit-Image and Dask Performance

Mon, 2017-07-17 20:00

This weekend at the SciPy 2017 sprints I worked alongside Scikit-image developers to investigate parallelizing scikit-image with Dask.

Here is a notebook of our work.

Categories: FLOSS Project Planets

NumFOCUS: NumFOCUS Projects at SciPy 2017

Mon, 2017-07-17 18:21
SciPy 2017 is a wrap! As you’d expect, we had lots of participation by NumFOCUS sponsored projects. Here’s a collection of links to talks given by our projects:   Tutorials Software Carpentry Scientific Python Course Tutorial by Maxim Belkin (Part 1) (Part 2) The Jupyter Interactive Widget Ecosystem Tutorial by Matt Craig, Sylvain Corlay, & Jason […]
Categories: FLOSS Project Planets

Weekly Python Chat: Ranges in Python

Mon, 2017-07-17 12:00

Want to loop over consecutive numbers? Backwards? In steps of 3? You want range.

The range function changed quite a bit between Python 2 and Python 3. We're going to discuss the behaviors of range and xrange in Python 2 and compare each to range in Python 3.

If you think xrange in Python 2 is the same as range in Python 3, you're in for a surprise.
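To preview one difference: Python 3's range is a full lazy sequence object, not just an iterator factory like Python 2's xrange, so indexing, slicing, and constant-time membership tests all work:

```python
r = range(0, 30, 3)           # steps of 3
print(list(r))                # [0, 3, 6, 9, 12, 15, 18, 21, 24, 27]
print(r[-1])                  # 27: negative indexing works
print(list(r[2:5]))           # [6, 9, 12]: slicing returns another range
print(15 in r)                # True, computed without iterating
print(list(range(5, 0, -1)))  # counting backwards: [5, 4, 3, 2, 1]
```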

Sign up and ask a range-related question!

Categories: FLOSS Project Planets

PyCharm: PyCharm 2017.2 EAP 7

Mon, 2017-07-17 11:19

The release of PyCharm 2017.2 is only a couple of weeks away. This week we have the seventh early access program (EAP) version ready for you, go to our website to get it now!

New in this version:

  • Many bugs have been fixed
  • Any virtualenvs detected for a certain project will now be visible only for that project
  • For more details, see the release notes.

Please let us know how you like it! Users who actively report about their experiences with the EAP can win prizes in our EAP competition. To participate: just report your findings on YouTrack, and help us improve PyCharm.

To get all EAP builds as soon as we publish them, set your update channel to EAP (go to Help | Check for Updates, click the ‘Updates’ link, and then select ‘Early Access Program’ in the dropdown). If you’d like to keep all your JetBrains tools up to date, try JetBrains Toolbox!

-PyCharm Team
The Drive to Develop

Categories: FLOSS Project Planets

Doug Hellmann: math — Mathematical Functions — PyMOTW 3

Mon, 2017-07-17 09:00
The math module implements many of the IEEE functions that would normally be found in the native platform C libraries for complex mathematical operations using floating point values, including logarithms and trigonometric operations. Read more… This post is part of the Python Module of the Week series for Python 3. See for more articles … Continue reading math — Mathematical Functions — PyMOTW 3
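A quick taste of the module (my own sketch, not part of the PyMOTW article):

```python
import math

# A few of the floating-point helpers the module wraps from libm.
print(math.log(math.e))              # natural logarithm, approximately 1.0
print(math.log10(1000))              # base-10 logarithm, approximately 3.0
print(math.sin(math.pi / 2))         # trigonometry, approximately 1.0
print(math.isclose(0.1 + 0.2, 0.3))  # True: tolerant float comparison
```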
Categories: FLOSS Project Planets

Kushal Das: Encrypting drives with LUKS

Mon, 2017-07-17 05:10

Encrypting hard drives should be a common step in our regular computer usage. If nothing else, this will help you sleep well, in case you lose your computer (theft) or that small USB disk you were carrying in your pocket. In this guide, I’ll explain how to encrypt your USB disks so that you have peace of mind, in case you lose them.

But, before we dig into the technical details, always remember the following from XKCD.

What is LUKS?

LUKS or Linux Unified Key Setup is a disk encryption specification, first introduced in 2004 by Clemens Fruhwirth. Notice the word specification; instead of trying to implement something of its own, LUKS is a standard way of doing drive encryption across tools and distributions. You can even use drives from Windows using the LibreCrypt application.

For the following example, I am going to use a standard 16 GB USB stick as my external drive.

Formatting the drive

Note: check the drive name/path twice before you press enter for any of the commands below. A mistake might destroy your primary drive, and there is no way to recover the data. So, execute with caution.

In my case, the drive is detected as /dev/sdb. It is always a good idea to format the drive before you start using it. You can use the wipefs tool to clean any signatures from the device:

$ sudo wipefs -a /dev/sdb1

Then you can use the fdisk tool to delete the old partitions and create a new primary partition.

The next step is to create the LUKS partition.

$ sudo cryptsetup luksFormat /dev/sdb1

WARNING!
========
This will overwrite data on /dev/sdb1 irrevocably.

Are you sure? (Type uppercase yes): YES
Enter passphrase:
Verify passphrase:

Opening up the encrypted drive and creating a filesystem

Next, we will open up the drive using the passphrase we just gave, and create a filesystem on the device.

$ sudo cryptsetup luksOpen /dev/sdb1 reddrive
Enter passphrase for /dev/sdb1
$ ls -l /dev/mapper/reddrive
lrwxrwxrwx. 1 root root 7 Jul 17 10:18 /dev/mapper/reddrive -> ../dm-5

I am going to create an EXT4 filesystem here, but feel free to create whichever filesystem you want.

$ sudo mkfs.ext4 /dev/mapper/reddrive -L reddrive
mke2fs 1.43.4 (31-Jan-2017)
Creating filesystem with 3815424 4k blocks and 954720 inodes
Filesystem UUID: b00be39d-4656-4022-92ea-6a518b08f1e1
Superblock backups stored on blocks:
    32768, 98304, 163840, 229376, 294912, 819200, 884736, 1605632, 2654208

Allocating group tables: done
Writing inode tables: done
Creating journal (16384 blocks): done
Writing superblocks and filesystem accounting information: done

Mounting, using, and unmounting the drive

The device is now ready to use. You can manually mount it with the mount command. Any modern desktop will ask you to unlock the device with the passphrase when you connect it (or double-click it in the file browser).

I will show the command line option. I will create a file hello.txt as an example.

$ sudo mount /dev/mapper/reddrive /mnt/red
$ su -c "echo hello > /mnt/red/hello.txt"
Password:
$ ls -l /mnt/red
total 20
-rw-rw-r--. 1 root root     6 Jul 17 10:26 hello.txt
drwx------. 2 root root 16384 Jul 17 10:21 lost+found
$ sudo umount /mnt/red
$ sudo cryptsetup luksClose reddrive

When I attach the drive to my system, the file browser asks me to unlock it using the following dialog. Remember to choose forget immediately so that the file browser forgets the password.

On passphrases

The FAQ entry on the cryptsetup page gives us hints and suggestions about passphrase creation.

If paranoid, add at least 20 bit. That is roughly four additional characters for random passphrases and roughly 32 characters for a random English sentence.
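That rule of thumb is easy to check with a little arithmetic: a passphrase drawn uniformly at random from an alphabet of N symbols carries log2(N) bits per character. The sketch below, including its alphabet sizes, is my own illustration, not part of the cryptsetup FAQ:

```python
import math


def entropy_bits(length, alphabet_size):
    """Bits of entropy in a passphrase of `length` symbols drawn
    uniformly at random from `alphabet_size` possibilities."""
    return length * math.log2(alphabet_size)


# Four extra characters from a 32-symbol alphabet: exactly 20 bits.
print(entropy_bits(4, 32))
# Four extra characters from all 95 printable ASCII symbols: ~26 bits.
print(round(entropy_bits(4, 95), 1))
```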

Key slots aka different passphrases

In LUKS, we get 8 different key slots (for passphrases) for each device(partition). You can see them using luksDump sub-command.

$ sudo cryptsetup luksDump /dev/sdb1 | grep Slot
Key Slot 0: ENABLED
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

Adding a new key

The following command adds a new key to the drive.

$ sudo cryptsetup luksAddKey /dev/sdb1 -S 5
Enter any existing passphrase:
Enter new passphrase for key slot:
Verify passphrase:

You will have to enter one of the existing passphrases in order to add a new key.

$ sudo cryptsetup luksDump /dev/sdb1 | grep Slot
Key Slot 0: ENABLED
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: ENABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

Removing a passphrase

Remember that removing a passphrase is based on the passphrase itself, not on the key slot number.

$ sudo cryptsetup luksRemoveKey /dev/sdb1
Enter passphrase to be deleted:
$ sudo cryptsetup luksDump /dev/sdb1 | grep Slot
Key Slot 0: ENABLED
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

Now, in case you don’t know the passphrase for a slot, you can remove it by slot number using luksKillSlot.

$ sudo cryptsetup luksKillSlot /dev/sdb1 3
Enter any remaining passphrase:

Overview of the disk layout

The disk layout looks like the following. The header (phdr) contains various details like the magic value, version, and cipher name, followed by the 8 key blocks (marked as kb1, kb2, ... in the drawing), and then the encrypted bulk data block. We can see all of those details in the C structure.

struct luks_phdr {
    char magic[LUKS_MAGIC_L];
    uint16_t version;
    char cipherName[LUKS_CIPHERNAME_L];
    char cipherMode[LUKS_CIPHERMODE_L];
    char hashSpec[LUKS_HASHSPEC_L];
    uint32_t payloadOffset;
    uint32_t keyBytes;
    char mkDigest[LUKS_DIGESTSIZE];
    char mkDigestSalt[LUKS_SALTSIZE];
    uint32_t mkDigestIterations;
    char uuid[UUID_STRING_L];

    struct {
        uint32_t active;

        /* parameters used for password processing */
        uint32_t passwordIterations;
        char passwordSalt[LUKS_SALTSIZE];

        /* parameters used for AF store/load */
        uint32_t keyMaterialOffset;
        uint32_t stripes;
    } keyblock[LUKS_NUMKEYS];

    /* Align it to 512 sector size */
    char _padding[432];
};

Each (active) keyblock contains an encrypted copy of the master key. When we enter the passphrase, it unlocks the master key, that in turn unlocks the encrypted data.

But, remember, all of this is of no use if you have a very simple passphrase. We have another XKCD to explain this.

I hope this post encourages you to use encrypted drives more. All of my computers have their drives encrypted (I do that while installing the operating system). This means that without decrypting the drive, you cannot boot the system properly. On a related note, remember to turn off your computer completely (not hibernation or suspend mode) when you’re traveling.

Categories: FLOSS Project Planets

"Fredrik Håård's Blaag": Another kind of nomad?

Mon, 2017-07-17 05:00

I used to enjoy a lot of the writing about digital nomadry, but the more I've started living the life I'm aiming for, the more what I read seems to describe an adolescent "workation" that depletes your savings rather than a sustainable career and lifestyle. In the interest of banging my own drum and giving a different perspective on a life of travel, I'd like to share our journey so far.

Planning and preparation

We started planning our mobile lifestyle over ten years ago, when we realized that a house, office jobs and kids weren't our dream, and also that the weeks we spent on our small (8.5 m) sailing boat were when we really felt at home. We did not just save some money to go on an extended cruise; rather, we planned our lives and careers so we could start a potentially never-ending cruise: Veronica runs Visual Units, a business offering IT support to logistics providers, and I'm a contractor.

Around this time I started spending more time building my personal brand, going to conferences to speak, getting involved in open source, taking care of my LinkedIn profile, and in general spending a bit of extra effort not just to be good at my job, but to be seen as being good at my job. Since I'm not amused by sales, I figured that the best route would be to try to let clients find me rather than the other way around.

In spring, 2013, I resigned from a job and an employer I still liked very much, but which did not and could not provide me with the freedom I wanted. I started my own company (my wife was already self-employed), we sold our apartment, and took a leap into the unknown.

A short leap

Well, we took a short leap. My first contract was for my previous employer, and we did not have a boat at this time, so we rented a cottage outside of Karlskrona, just some ten kilometers from where we used to live. During that winter, I finished up my contracts and we found and bought the boat we wanted - not the biggest, but one we can live on and could afford without taking a loan. In spring 2014 we packed our things in a one-way trailer, returned the key to the cottage, and left for the Swedish west coast, where our new home had just been prepared, masted, and put into the water.

A steady pace

Since then, we've been on the move, more or less. Northern European winters being what they are, we've rented places in winter, mostly coinciding with on-site contracts for me. We've lived most of the winter in Stockholm (twice) and Edinburgh, and we've traveled by car and AirBnB - a few weeks at most - in Germany, Italy, France, England and Scotland.

But it's the summers we're planning for. Not quite aimlessly drifting, we've slowly taken us from the Swedish west coast up to the very far north of the Baltic sea. We generally (try to) get up early, walk the dogs, eat breakfast and then set sail. Around noon we'll reach whatever goal we had in mind for the day, have lunch, and then set to work. We don't exactly have a normal work week - hours depend on customers needs more than day of the week, but generally we both work around 20-30 hours a week - although like most self-employed I know of, sometimes we also go way over 40 hours for a short while.

When winter comes, we'll put the boat on land wherever we happen to find ourselves, figure out how to get our car, pack up, and find a place - or more likely a couple of places - to stay until spring. The south of France seems likely right now. Or somewhere else entirely. Next winter, we hope to avoid leaving the boat at all by sailing to the Mediterranean instead. In time, we'll probably buy a slightly bigger boat, one we can stay in even through a Swedish winter.

Money in, money out

Of course - living in Chiang Mai is probably cheaper than our current setup. On the other hand, with no mortgage to pay, it's not as expensive as you might think, and at our slow pace of moving around we can get actual, proper work done that pays actual, proper rates. Remote work does not pay as much as, say, on-site work in Stockholm city, but on the other hand our summer rent averages out around 300 EUR a month, all inclusive. Staying in Europe has an extra benefit on the income front - 4G/LTE networks are commonplace, and with roaming charges gone, working remotely has never been easier. A mid-range 4G modem and an extra MIMO antenna, and we're set to work almost anywhere.

In the end, we're not trying to live on a shoestring, nor trying (hard) to get rich, but rather to balance the amount and type of work we do with the money we need to sustain our lifestyle in the long run.

Right now, I'm working through Toptal on a project for an American customer - and my total sales investment was showing up for the video interview while we were living in a mountain cottage above the Aosta Valley, in the Italian Alps. I can live with making a few percent less for that kind of freedom.

Family, flock and friends

People seem to assume that living and working as closely as we do - and we're very rarely apart for more than a couple of hours - would cause problems. I can only respond that we probably wouldn't have gotten married if we didn't enjoy spending time together.

Living and working together on 12 square meters does lead to friction, but on the other hand you learn to resolve your conflicts pretty fast when there's nowhere to run off and sulk. Besides, it upsets our dogs something fierce when we argue, and we don't want to upset the dogs, do we?

Speaking of the dogs, taking a long walk with them is an excellent way to get some of that alone time I think everybody needs, with the bonus of getting some exercise and being forced to take a break no matter how deeply buried in work you are. They also help us find new friends wherever we go - we now have friends spread across Europe, and come winter we have more opportunity than most to visit them.


Right now, we have no great plans to change our lives - while we haven't run out of things we'd like to do, there are very few, if any, things we feel we need to do or change. Our long-term plans (anything beyond a month or so) are ever fluid, but that's OK: with nothing to hold us down, we can go with the flow.

I'm planning a few shorter write-ups on specific subjects around remote work, nomadry and freelancing, and if and when I get around to it I'll update this post with links as well.

This has turned into a - for me - monster of a blog post, so a huge thank you if you made it this far!

Categories: FLOSS Project Planets

Python Software Foundation: Welcome New Board Members

Mon, 2017-07-17 03:30
The PSF is thrilled to welcome six new board members, chosen on June 11 during the 2017 PSF Board Election. The PSF would not be what it is without the expertise and diversity of our board, and we look forward to seeing what our new members accomplish this quarter. Read on to learn more about them and their initial goals as PSF Board Members.

Paul Hildebrandt has been a Senior Engineer with Walt Disney Animation Studios since 1996. He resides outside of Los Angeles with his wife and three boys. In his first quarter, he hopes to serve the Python community by better understanding the well-oiled machine that is the PSF and by handling regular board activity. He desires to contribute by focusing on sponsorship and corporate involvement opportunities.

Eric Holscher is co-founder of Read the Docs and Write the Docs, where he works to elevate the status of documentation in the software industry. He has hiked 800 miles of the Pacific Crest Trail, and spends most of his spare time in the woods or traveling the world. His wish is to focus on sustainability and to create a new initiative that will bring in sponsors who are focused on the sustainability of the ecosystem such as PyPI, Read the Docs, and pip.

Marlene Mhangami is the director and co-founder of ZimboPy, an organization that teaches Zimbabwean girls how to code in Python. Through her organization she has worked with the organizers of Django Girls Chinoyi and Harare, as well as PyCon Zimbabwe to grow the use of Python locally. Her goals for the quarter are to help connect, support, and represent issues relevant to Pythonistas in Africa. She will seek to increase the number of PyCons in the region and facilitate the inclusion of women and other underrepresented groups.

Paola Katherine Pacheco is a backend Python developer and organizer of Python groups such as PyLadies Brazil, PyLadies Rio de Janeiro, Django Girls Rio de Janeiro, Pyladies Mendoza and Python Mendoza. She runs a YouTube channel where she teaches Python in Portuguese. Her goals this quarter are to energize Python events for the Brazilian and Argentine Python communities, and to increase diversity by promoting education and events to women and underrepresented groups.

Kenneth Reitz is the product owner of Python at Heroku. He is well-known for his many open source software projects, specifically Requests: HTTP for Humans. He seeks to contribute towards the PSF's continued optimization of its operations, increase its sustainability, and the sustainability of the entire Python ecosystem.

Thomas Wouters is a long-time CPython core developer and a founding PSF member. He has worked at Google since 2006, maintaining the internal Python infrastructure. His immediate goal is to get reacquainted with the PSF procedures and the matters the board attends to, both of which have changed a lot since he last served on the Board of Directors. Longer term, he would like to raise awareness of the practical side of the Python community: the website, mailing lists, and other help channels like IRC, as well as actual code development and services like PyPI.

Python Insider: Python 3.6.2 is now available

Sun, 2017-07-16 21:43
Python 3.6.2 is now available. Python 3.6.2 is the second maintenance release of Python 3.6, which was initially released in 2016-12 to great interest. With the release of 3.6.2, we are now providing the second set of bugfixes and documentation updates to 3.6. Detailed information about the changes made in 3.6.2 can be found in its change log. See the What’s New In Python 3.6 document for more information about features included in the 3.6 series.

You can download Python 3.6.2 here. The next maintenance release is expected to follow in about 3 months, around the end of 2017-09.