Feeds

FSF Events: Richard Stallman - "Free Software in Ethics and in Practice" (Rennes, France)

GNU Planet! - Thu, 2019-03-14 13:46
Richard Stallman will speak about the goals and philosophy of the Free Software Movement, and the status and history of the GNU operating system, which in combination with the kernel Linux is now used by tens of millions of users world-wide.

Richard Stallman's speech will be nontechnical, admission is gratis, and the public is encouraged to attend.

Location: University of Rennes 1, Rennes, France (exact location to be determined)

Please fill out our contact form, so that we can contact you about future events in and around Rennes.

Categories: FLOSS Project Planets

Amazee Labs: Drupal Mountain Camp 2019 Recap

Planet Drupal - Thu, 2019-03-14 12:52

This year saw the return of Drupal Mountain Camp in Davos, Switzerland, with people from all over the world attending this three-day camp. I was joined by my fellow colleagues, Victor, Daniel, Maria, Bastian, and Michael.

Vijay Dubb Thu, 03/14/2019 - 17:52

Day one

The weather was snowy and cold and caused some transportation delays. Though I arrived later than planned, I was able to attend afternoon workshops and work on some Drupal Contribution projects. The day finished with an Apéro, where everyone gathered for good drinks and great conversations. To finish the night, the Amazee team left to have dinner together.

Day two

The second day of Drupal Mountain Camp began with a panel discussion on the future of Drupal communities, with Nick Veenhof, Imre Gmelig Meijling, Yauhen Zenko and Vincent Maucorps, moderated by Rachel Lawson. The panel took questions from the audience, which included:

  • How can we attract young talent?
    By targeting students, as they have the most time. Having a DrupalCamp at a university shows them what the community has to offer. This can also be achieved by getting universities to add Drupal to their course curricula. Another way is offering a training initiative or talking to agencies.
  • What can we do about International collaborations?
    Related to the previous question, maybe offer a base camp or training day. This allows those who wouldn’t be able to attend a larger event to learn. Live streaming is a good option for those not able to attend in person.
  • What are the benefits of sponsoring events, such as Drupal Mountain Camp?
    Sponsoring is a great way to find talent and increase brand recognition, particularly for newer companies.

GraphQL 101: What, Why, How

This session was presented by fellow colleague Maria Comas as a beginner's guide to GraphQL. Throughout the presentation, it became clear why GraphQL is so powerful. I really liked the abbreviation WYGISWAF (What You Get Is What You Asked For), because that is essentially what GraphQL does: it returns only the data that you asked for. Maria showed us how this is achieved through demo code, before letting people know that this is already available in Drupal through the GraphQL module. As it was International Women's Day, it was fitting that Maria ended the session with the following quote by computer scientist Grace Hopper:

The most damaging phrase in the language is "We’ve always done it this way!"
- Grace Hopper
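
To illustrate the idea Maria described (this sketch is mine, not code from the talk), here is roughly how a client asks a GraphQL endpoint for only the fields it wants, using Python's requests library; the endpoint URL and field names are made-up placeholders:

import requests

# Hypothetical GraphQL endpoint; replace with a real one
GRAPHQL_URL = 'https://example.com/graphql'

# The query names exactly the fields we want back, nothing more
query = '''
{
  article(id: 42) {
    title
    author {
      name
    }
  }
}
'''

response = requests.post(GRAPHQL_URL, json={'query': query}, timeout=10)
response.raise_for_status()

# The response contains only the title and author name we asked for
print(response.json()['data'])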
 

Mob Programming: An interactive session

Next was Daniel Lemon, whose session was all about mob programming. Having already introduced a scrum team to mob programming, Daniel wanted to share the experience. This presentation gave a broad overview of mob programming. What impressed me most about this session was that Daniel didn't just want to explain to the audience what mob programming is, but got members of the audience to participate in a live mob session. This meant that those involved and those watching could see how mob programming works.

Participants were tasked with adding a gallery page and menu item to the Drupal Mountain Camp site within 15 minutes, taking turns of two minutes each as driver or navigator. After the task was introduced, the five participants created a basic implementation of the gallery page. The session ended with a quick retrospective, in which participants said they were genuinely motivated to try this within their own companies. Many felt it was a nice change from the ordinary single-developer experience, but some observed it could be difficult to keep up, especially in the role of the driver.

Splash awards, fondue, sledding, and drinks!

The Splash Awards recognise the best Drupal projects of the year. Amazee Labs won an award for Zuerich.com in the Design/UX category.

During the awards, Jeffrey McGuire treated us to sounds from the Alphorn, which I, personally, had never heard before. The sound produced was truly beautiful. After the awards, everyone made their way to the funicular station to collect their sleds and headed up to the Belle Epoque restaurant. I was unable to go sledding as I didn't have the right footwear, so I went to eat fondue with fellow colleagues Victor, Bastian, and Michael. There really is nothing better than ending the day with fondue.

Day three

Day three started with a keynote by Matthew Grill about the Drupal Admin UI & JavaScript Modernisation initiative, in which he informed us about the initiative's current progress. After the initial showing at Drupal Europe, it was clear that existing modules wouldn't be compatible. This led to the team creating extension points, which allow current modules to bundle and transpile their JavaScript for use with the Admin UI without an extra build step.

It was clear that this was still a work in progress but nonetheless, it was nice to hear the latest update about the initiative. After the session, everyone was invited to the group photo. Say “Drupal”!

Current state of the Drupal Admin UI Redesign

The next session was again about the Drupal Admin UI, but this time about the design. It was given by Sascha Eggenberger and Cristina Chumillas, who explained and showcased the new design system, wireframes, and the current state of the designs the initiative is proposing. It was clear that the design process was long and full of opinions: designing a button wasn't as straightforward as expected, due to its many states and types. The team is hoping for a release in Drupal 8.7, but when someone asked, it became clear that it is a slow process and this might not happen in time. It was noted that they also need help from contributors.

If you want to help or just know more about the above, head to the Admin UI & JavaScript Modernisation initiative.

Optimise your JavaScript

Saša Nikolić gave his session on optimising JavaScript. It started with a short history of the internet, in which I learned that Drupal came before Facebook. Saša also covered data loading: loading lots of data, with lots of data manipulation, is not good for the user, as it slows down page loads.

The session also explained how to address various scenarios and the general rules that every JavaScript developer should be familiar with in order to boost a site's performance. This includes using tools like Google Chrome dev tools and Lighthouse. Tree shaking was another suggestion: include only the functions that are actually needed. I also came to learn about Prepack, a JavaScript bundle optimiser. Another useful piece of advice was to utilise CSS: why use JavaScript for animations when CSS can take care of this? If unsupported browsers are the reason, leave the animation out and let the page degrade as gracefully as possible. I also enjoyed the joke about “eval() = bad”.

Network was the bottleneck, now it’s JavaScript.
- Saša Nikolić

Open source contribution

This was my favourite session of the day, in which I learned about the opinions of Cristina Chumillas, Miro Dietiker, Kevin Wenger, Michael Schmid, and Lukas Smith about everything to do with open source. This was an open forum, moderated by Josef Dabernig, in which audience members were encouraged to ask any questions they had about open source.

  • What motivates you to contribute to open source?
    It is concrete: you can see what you have done. People will review your code, which not only makes the code better but also makes you better. On a side note, people should just work together and join forces; that is the Drupal mindset.
  • What is the advantage of open source software over proprietary software?
    Not only does it help with maintaining the code, but contributors with different backgrounds also help with innovation. Proprietary software means being on your own, which is sometimes not productive.
  • What is a good way to avoid maintainer burnout?
    Having a coach is a good way to let them, and other people, know about any problems and to get help. Avoid those who don't have your best interests at heart. Share the knowledge, don't let one person do everything, and don't take everything on yourself just for the credit.

It was really nice to hear those answers and I couldn't agree more. As someone who loves to contribute to open source, I think the biggest benefit is that your code only becomes stronger when you share it with others. After all, two heads are better than one.

Closing

Lukas Smith gave a very thought-provoking and inspiring closing session titled "Diversity & Inclusion: Why and How?". Lukas shared personal insights into becoming active in improving diversity and inclusiveness. He challenged the audience with some shocking statistics on the low ratio of female to male programmers across Switzerland and the United States, and then revealed that in open source this percentage is even lower.

What can we do to better ourselves and improve diversity? Lukas finished the session with several tips for improving diversity, some of which I find important to highlight:

  • Challenge your cognitive biases.
  • Consider specifically following people from marginalized communities in your chosen field.
  • Believe members of marginalized communities when they point out issues with bias, even if you have never encountered those issues yourself.
  • Work on using inclusive language.

While talking about inclusion, I, along with everyone who attended, was happy to see that there were three sign language interpreters at the event. This meant that attendees who are deaf or hard of hearing were not excluded from the camp. This was another reason why this camp was exceptional.

If someone points out an offensive statement, make an effort to not become defensive. Listen, learn, move on.
- Lukas Kahwe Smith

After the closing everyone was invited for the ice hockey match between HC Davos and Rapperswil. This was my first time watching an ice hockey game, so it was wonderful to attend. It was a great match, with both a great atmosphere and great people. With that ended the great weekend that was Drupal Mountain Camp. I can honestly say that I had such a great time, especially spending time with my team and the Drupal community.

Finally, you hear it all the time, “thank you to all the sponsors”, but honestly, it cannot be expressed enough. Without them, great camps like Drupal Mountain Camp wouldn’t be possible.

Categories: FLOSS Project Planets

Sooper Drupal Themes: 8 Ways a Drupal Page Builder can drive Business Value

Planet Drupal - Thu, 2019-03-14 11:56

Using Drupal as your default CMS undoubtedly has advantages, but it also comes with downsides. The price you have to pay for its customizability is complexity and a steep learning curve. Here at Sooperthemes, we have thought of you and developed an easy-to-use solution: Glazed Builder. With this visual Drupal page builder, you and your team of content creators and marketeers will be able to create rich content and beautiful web pages for your business, without having to touch a line of code.

In this article, I present to you 8 ways through which a visual page builder like Glazed Builder can further create value for your business.

1. Cut your landing page costs and time-to-market in half

Having a good landing page is paramount to the success of your business. However, it takes plenty of time and money to find the right people and tools to do it. With Glazed Builder as your Drupal 8 page builder, creating a landing page has never been easier, cheaper or faster. Content creators and marketeers will be able to create a visually stunning landing page in a matter of minutes, without having to rely on the IT department.

2. Stress less: Reduce employee turnover in your content team with true WYSIWYG

Are your employees stressed that the webpage they are building is going to look completely different than they imagined? Well, with Glazed Builder, your content creators will experience true WYSIWYG (what you see is what you get). That means that whatever they have imagined for your webpage is going to be their final result. No more senseless stress for your content creating team.

3. Get twice as much Drupal site-building work done by your most expensive staff: Developers

Developers are the most expensive members of your staff, yet they don't always get work done as fast as you would like. One way to increase productivity is to have developers use Glazed Builder as your default Drupal page builder to build dynamic pages and dashboards that leverage Drupal's block and Views systems. This makes their job easier while also increasing their productivity.

4. Same-day web design and publishing by using the pre-built templates

You need to launch a webpage in a matter of hours and you don’t have the inspiration necessary to design a layout? Fret not, Glazed Builder, the Drupal page builder, has you covered. With a plethora of templates available, you just have to select the right template for your business, insert your content, and post it. It has never been easier.

5. Content creators will produce better, more effective content than your competitors.

Do you want to stand out from your competition in terms of content creation? Glazed Builder can help you and your content creators unleash their creativity. With an endless amount of customizability, Glazed Builder is sure to provide the right tools and power for your content creators to achieve their wildest dreams. When it comes to customizability, with Glazed Builder, the sky's the limit.

6. Reduce onboarding time and training costs: Reduce Drupal’s steep learning curve for content creators and marketeers

Every time a new tool is introduced to your business, you have to pay a large amount of money to train your employees. The same applies to Drupal: since it is a highly complex CMS, it has a steep learning curve and requires highly skilled developers to make it truly shine. However, Glazed Builder was engineered to be usable by even the least tech-savvy users. This way, your staff will quickly understand how to operate the visual builder, and you will reduce the time and money spent on training your personnel.

7. Save thousands on cloud hosting costs with a frontend tool that runs in your browser, not in your cloud

If you're thinking that a Drupal 8 website with the additional features of Glazed Builder requires a beefy server, you're wrong! 90% of Glazed Builder's magic is happening in the browser. Even our Drupal 8 demo sites with hundreds of demo content items run perfectly fine on affordable shared hosting solutions like our favorite Drupal hosting A2Hosting.

8. Better performance attracts a bigger number of visitors on your webpage

Even if you have top-notch content on your website, it's irrelevant when the site takes a long time to load. Most site visitors don't have patience when it comes to loading a webpage; they will simply exit and visit the next one if it takes too much time. Drupal, however, is one of the fastest CMSs out there: it takes very little time to load a page, which means the likelihood of visitors leaving drops significantly.

Conclusion on Drupal Page Builder

Now that you know all of this, what are you waiting for?

Start improving your business today by using a visual page builder like Glazed Builder.

Categories: FLOSS Project Planets

Stack Abuse: Introduction to Python FTP

Planet Python - Thu, 2019-03-14 11:20
Introduction

In this tutorial, we will explore how to use FTP with Python to send and receive files from a server over TCP/IP connections.

To make things easier and more abstract, we will be using Python's ftplib library which provides a range of functionalities that make it easier to work with FTP. We'll see the implementation for uploading and downloading files from the server, as well as some other cool things that "ftplib" allows us to do.

What is FTP?

FTP stands for File Transfer Protocol; it is based on the client-server architecture and is widely used. It has two channels: a command channel and a data channel. The command channel is used to control the communication and the data channel is used for the actual transmission of files. There's a wide range of things that you can do using FTP, like moving, downloading, and copying files. We will discuss those in a later section, along with the details on how to do them using Python.

Working with FTP in Python

Moving on, you will be happy to know that ftplib is a built-in library that comes pre-installed with Python; all you need to do is import it in your script and you can start using its functions. To import it, use the following statement:

from ftplib import FTP

After that, we need to initiate a connection to the FTP server that we want to open a communication link with. To do so, create an FTP instance:

# Replace the example domain with your domain name
ftp = FTP('ftp.example.com')

The above call uses the default port, i.e. port 21, for establishing a connection with the server. The next step is to provide your login credentials, i.e. your username and password, to get access to the files on the server. You can use the following method for that:

ftp.login('your_username','your_password')

The default values for username and password are 'anonymous' and 'anonymous@', respectively. If the connection is successful, you should receive a message similar to "230 Login Successful".

Now that we have established a connection to the server, we want to navigate to the directory where we wish to do operations, i.e. get or write a file in. For that, we change the 'current working directory' using the following command:

ftp.cwd('/path/to/the/directory/')

Let's now discuss some basic examples of how to get a file from a directory or write a file to a directory. The explanation of the code is provided in the comments alongside each line of code:

file_name = 'a-filename.txt'
my_file = open(file_name, 'wb')  # Open a local file to store the downloaded file
ftp.retrbinary('RETR ' + file_name, my_file.write, 1024)  # Download the named file from the server

In the retrbinary call above, the 1024 means that the file will be downloaded in blocks of 1024 bytes until the whole file is transferred.

There's one more thing that you need to do after downloading or uploading a file - close that file and also close the FTP connection that you had opened. You can do that for the above example with the following two lines of code:

ftp.quit()  # Terminate the FTP connection
my_file.close()  # Close the local file you had opened for downloading/storing its content

Let's now try to upload a file to the server. In addition to the commands below, you would also need to repeat the commands we used above to open an FTP connection.

file_name = 'a-filename.txt'
ftp.storbinary('STOR ' + file_name, open(file_name, 'rb'))  # Upload the local file to the server

In the above examples, 'rb' and 'wb' mean "read binary" and "write binary", respectively.
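
Putting the pieces together, here is a minimal end-to-end sketch; the host, credentials and file names are placeholders you would replace with your own:

from ftplib import FTP

# Placeholder server and credentials
ftp = FTP('ftp.example.com')
ftp.login('your_username', 'your_password')
ftp.cwd('/path/to/the/directory/')

# Download a remote file to the local disk
with open('a-filename.txt', 'wb') as local_file:
    ftp.retrbinary('RETR a-filename.txt', local_file.write, 1024)

# Upload a local file to the server
with open('another-file.txt', 'rb') as local_file:
    ftp.storbinary('STOR another-file.txt', local_file)

ftp.quit()  # Always close the connection when you are done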

Additional FTP Functionalities

Now that we have discussed the implementation for the main features, let's now see some additional functionality that ftplib provides us.

List Files and Directories

To see the files and folders in your current working directory, in list format, run the retrlines command:

ftp.retrlines('LIST')

Make a New Directory

In order to organize your files in a certain manner, you might feel the need to create a new directory on the server, which you can do using a single line of code:

ftp.mkd('/path/for/the/directory')

The path is the location at which you wish the new directory to be created.

Delete a File from the Server

Removing a file on the server is fairly simple, you just have to give the name of the file as a parameter to the delete function. The operation's success or failure will be conveyed by a response message.

ftp.delete('file_name_to_delete')

Check Current Path

To check your current path, simply run the following code:

ftp.pwd()

This command will return the absolute path to the current working directory.

Caution

It is important to note that FTP is not a secure protocol: credentials and data are sent in plain text, so it should not be used to transfer sensitive information. If you're transferring something like that, you should go for more secure options like SFTP (SSH File Transfer Protocol) or SSH (Secure Shell). These are the most commonly used protocols for handling sensitive data transmission.
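
For comparison, here is a minimal SFTP sketch using the third-party paramiko library (not part of the standard library); the host and credentials are placeholders, and in production you would verify the server's host key instead of auto-accepting it:

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())  # Demo only: accepts unknown host keys
client.connect('sftp.example.com', username='your_username', password='your_password')

sftp = client.open_sftp()
sftp.get('/remote/path/a-filename.txt', 'a-filename.txt')      # Download over an encrypted channel
sftp.put('another-file.txt', '/remote/path/another-file.txt')  # Upload over an encrypted channel

sftp.close()
client.close()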

Conclusion

In this post, we discussed what FTP is and how it works with the help of different examples. We also saw how to use Python's "ftplib" module to communicate with a remote server using FTP and saw some other functionalities that the module offers. In the end, we also discussed some more secure alternatives to FTP, such as SFTP and SSH, which are used for the transfer of sensitive information.

For more information on using FTP with Python, see the official ftplib docs or RFC 959.

Categories: FLOSS Project Planets

Dataquest: Tutorial: Why Functions Modify Lists and Dictionaries in Python

Planet Python - Thu, 2019-03-14 11:01

In this beginner Python tutorial, we'll take a look at mutable and immutable data types, and learn how to keep dictionaries and lists from being modified by our functions.
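
As a small taste of what the tutorial covers, here is a minimal sketch of the behaviour in question: a function mutating the list it was given, and passing a copy to avoid that:

def add_item(items):
    items.append('new item')  # Mutates the list object the caller passed in

original = ['a', 'b']
add_item(original)
print(original)  # ['a', 'b', 'new item'] - the caller's list changed

original = ['a', 'b']
add_item(list(original))  # Pass a copy instead, so the original is untouched
print(original)  # ['a', 'b']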

The post Tutorial: Why Functions Modify Lists and Dictionaries in Python appeared first on Dataquest.

Categories: FLOSS Project Planets

TEN7 Blog's Drupal Posts: Becoming a TEN7 Support Client: You Can’t Just Give Us Cash

Planet Drupal - Thu, 2019-03-14 10:09

TEN7 is a full-service digital firm, and our tagline is “We create and care for Drupal-powered websites.” Creating and building a website is the sexy visible part, but caring for a website over time is the just-as-important maintenance work. Keeping your site code updated, backed up, secure and performing well is the job of a support team.

Categories: FLOSS Project Planets

Chromatic: Creating and Using Entity Storage Methods

Planet Drupal - Thu, 2019-03-14 09:47

Entity storage methods are an often used, yet easily overlooked tool for improving data retrieval and code architecture.

Categories: FLOSS Project Planets

Blair Wadman: Guide to Ultimate Cron

Planet Drupal - Thu, 2019-03-14 08:06

Drupal comes with its own built-in cron. This means that you can add your own jobs to the list of jobs that are executed when the Drupal cron runs.

Categories: FLOSS Project Planets

OpenSense Labs: Everything you need to know about Blue-Green Deployment and Drupal

Planet Drupal - Thu, 2019-03-14 07:32
Vasundhra Thu, 03/14/2019 - 17:02

A lot of people are very interested in the “DevOps” concept, and when I sat down with some of them, the conversation went down many interesting paths.

They started talking about deployment best practices, rollbacks, hot deployment etc. 


But when “Blue-Green Deployment” was mentioned: complete silence.

This gave me the idea to tell the rest of the world that, amid all the microservices, cloud-native and whatnot technology, blue-green deployment is not a silver bullet, but it is a genuinely useful technique.

How?

Well, you got to read ahead. 

What do we understand by blue-green deployment?

A blue-green deployment is a management approach for releasing software code. 

In blue-green deployments, also known as A/B deployments, two identical hardware environments are configured in exactly the same way.

Only one of the environments is live at any given time, and the live environment serves all the production traffic. For example, if blue is currently live then green is idle, and vice versa.

Blue-green deployments are usually used for consumer-facing applications and applications with critical uptime requirements. The new code is delivered to the inactive environment, where it is thoroughly tested.

How does it reduce risk?

Achieving automation and continuous delivery at any level of production is a holy grail, and avoiding downtime and risk is high up on the list of priorities. Blue-green deployment provides a simple way of achieving these goals by eliminating many of the risks seen during deployment.

  • You will never encounter surprise errors

When you fill in a form online, what details do you enter? Your name, phone number, street address, and probably your bank details if you are making an online purchase. Right?

You press the “pay now” button and tick the “receive spam emails” box, but unfortunately your order fails to process. If you are lucky, you get an error message along the lines of “application is offline for maintenance”, and all your effort and time go to waste. With blue-green deployment, you never have to show users this maintenance screen.

On one click users see the existing list of items, and on the next click they see the new menu you added, with no maintenance screen in between. This keeps furious emails about error screens from flooding your inbox.

  • Testing the production environment 

Ensuring that your pre-production environment is as close to your production environment as possible is not only important but essential. With blue-green deployment, this is easily achievable: you can test the application while it is disconnected from the main traffic, and the team can even load-test it if they wish.

  • Makes sure that the traffic is seamless 

Customer needs and desires are more global than ever, and there is no longer an obviously good time to deploy, especially if you work in an enterprise where the business needs to run around the clock. If you have a customer-facing application, there is a chance customers will switch to some other website if they don't find what they want. That means a decrease in sales and business.

Blue-green deployment ensures that your traffic never stops: customers can place their orders just fine without disruption, and employees overseas can continue to do their jobs without interruption, saving the company money.

  • Easy Recovery 

There will be times when bugs make it into production. You can either spend a lot of money fixing them under pressure, or plan ahead so you can recover quickly. With blue-green deployment, the older, more stable version of the application can come back online at a moment's notice, avoiding the pain of rolling back a deployment.

Source: Martin Fowler

How does this process work?

The blue-green deployment technique involves running two identical production environments configured in the same way. Let us assume that the current deployment, release 2.3, is running in the green environment. The next deployment, release 2.4, would go to the blue environment.

The blue environment is then tested and evaluated until it is confirmed to be stable and responsive. Once it is ready, traffic is redirected to it, and it becomes the new production environment that users are routed to.

The whole design exists to provide fast rollbacks in case a deployment fails or does not pass QA. When a deployment fails or critical bugs are identified, a rollback to the green environment is initiated. Once the bugs are fixed, the version is re-deployed to the blue environment and traffic is rerouted back the moment it is stable.

When deploying the next version, i.e. version 2.5, the deployment switches to the green environment, which is extensively tested and evaluated. Traffic is rerouted to the green environment once it passes the quality assessment.

This way both the green and blue environments are regularly cycled between serving the live version and staging the next one.

Source: Medium

Blue-Green Deployment helping your Drupal websites

Let us imagine that you built a website with Drupal and it is now getting high traffic. Normally, for developing, updating and testing a website (without risking the integrity of the live site), you follow these stages:

Development: The development process starts with developers working on new features, bug fixes, theming and configuration in the local environment. It makes it possible to easily roll back to the previous stage of development.
 
Testing: Typically this environment is not available for client viewing and it is intended for testing developmental work against a lateral host. 

Staging: This stage is used for presenting the changes to the client for approval. QA (quality assurance) and UAT (user acceptance testing) are most often carried out on the staging stage. 

Production: This is the live site on the web, available to visitors. It contains new features that have been proven safe to go live.

As you can see, this process can be long and time-consuming, and maintaining and building a site this way can be frustrating. Blue-green deployment rescues you at times like these.

It provides near-zero downtime and easy rollback capabilities. The fundamental idea behind blue/green deployment is to shift traffic between two identical environments that run different versions of your application.

Source: NewGenapps

Some of the implementations for your Drupal website

Blue-Green Deployment for Drupal websites with Docker 

Drupal deployments are hard. The user has to make sure that the code is deployed, Composer dependencies are pulled, schema updates are performed and all the caches are cleared.

All of this while keeping the website up and responsive to users. And what if anything goes wrong and you wish to roll back? Do you stop the deployment? No: blue-green deployment is the answer.

Docker makes it easy to build, ship and run applications. On the EC2 instance there are always two running Docker containers, “blue” and “green”, and nginx acts as a reverse proxy on the same instance. The user can build a Drupal site that runs in parallel in the “blue” and “green” environments and serve both from the same MySQL database. Apache, PHP and Drupal are installed in a baseimage-docker image.

Source: Nulab
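
As an illustration of the switch step (this sketch is mine, not from the original post), assume the main nginx configuration includes a small upstream file that points at whichever container is live; the file path and port numbers are made-up placeholders:

import subprocess

# Hypothetical ports the "blue" and "green" Drupal containers listen on
UPSTREAMS = {'blue': 8081, 'green': 8082}

# Hypothetical include file referenced from the main nginx configuration
CONFIG_PATH = '/etc/nginx/conf.d/live_upstream.conf'

def switch_live(color):
    """Point nginx at the chosen container and reload it."""
    port = UPSTREAMS[color]
    with open(CONFIG_PATH, 'w') as config:
        config.write('upstream drupal_live { server 127.0.0.1:%d; }\n' % port)
    # Reload the nginx configuration without dropping existing connections
    subprocess.run(['nginx', '-s', 'reload'], check=True)

switch_live('green')  # For example, after the green container passed its health checks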

Drupal with Blue-Green Deployment in AWS Beanstalk 

With the help of ECS, the user can create task definitions, which are very similar to a docker-compose.yml file.

A task definition is a collection of container definitions, each of which has a name and the Docker image to run, with the option to override the image's entry point and command. The container definition is also where the user defines environment variables, port mappings, volumes to mount, memory and CPU allocation, and whether or not the specific container should be considered essential, which is how ECS knows whether the task is healthy or needs to be restarted.

The Amazon Web Services solution allows the user to quickly and easily manage the deployment and scaling of web platforms. It helps in configuring a high-availability environment that seamlessly runs a Drupal website. Running a DB instance that is external to Elastic Beanstalk decouples the database from the lifecycle of the environment, and lets the user connect to the same database from multiple environments, swap one database for another, and perform a blue-green deployment without affecting the database.
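
In Elastic Beanstalk, the actual cut-over between the two environments is a CNAME swap. As a hedged illustration (not from the original post), it could be scripted with boto3 roughly like this; the environment names are placeholders:

import boto3

eb = boto3.client('elasticbeanstalk')

# Hypothetical names for the blue and green environments
eb.swap_environment_cnames(
    SourceEnvironmentName='drupal-blue',
    DestinationEnvironmentName='drupal-green',
)
# After the swap, the URL that pointed at the blue environment now serves the green one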

The image below shows how blue-green deployment works in an AWS environment.

Source: CloudNative

Some of the best practices for a smooth release

Now that we understand how blue-green deployment works, let’s cover some of the best practices that are related to it:

Load Balancing

Load balancing lets you switch to a new server automatically, without depending on the DNS mechanism. The DNS record always points to the load balancer, and you only modify the servers behind it. This way you can be absolutely sure that all traffic goes to the new production environment instead of the old one.

Rolling Update

To avoid downtime, the user can perform a rolling update: instead of switching from all blue servers to all green servers in a single cut-over, servers are switched gradually, so for a while you work with a mixed environment.

Monitoring the environment 

Monitoring both the production and the non-production environment is important. Since the same environment can serve as either production or non-production, all you need to do is toggle the alerting between the two states.

Automate

The user should script as many actions as possible in the switch process, instead of performing a manual set of actions. This brings huge benefits: the process becomes quicker, easier and safer, and it enables self-service.

Deployment in cloud

If your servers run in the cloud, there is an interesting variation of the Blue-Green method in which instead of going back and forth between two static environments, you can just create the next environment from scratch.

This process is also valuable for avoiding the danger of servers becoming snowflakes, which are servers with a unique configuration set that isn't documented anywhere. If these snowflakes get erased for some reason, you have no easy way to properly recreate them. Whatever the choice, it is important to keep up with current testing and release technology to ensure that releases stay smooth.

Conclusion 

Deployments are one of the most important parts of the software development lifecycle; therefore, all the activities involved should be thoroughly researched and tested to ensure that they are a good fit for your system architecture and business.

At OpenSense Labs, we have a pool of Drupal developers and experts that work with these tools and services. Contact us at hello@opensenselabs.com and our experts will guide you through any questions related to this topic.

Categories: FLOSS Project Planets

Craig Small: WordPress 5.1.1

Planet Debian - Thu, 2019-03-14 07:09

The Debian packages for WordPress version 5.1.1 are being updated as I write this. This is a security fix for WordPress that stops comments causing a cross-site scripting bug. It’s an important one to update.

The backports should happen soon so even if you are using Debian stable you’ll be covered.

Categories: FLOSS Project Planets

James Duncan: Of course Kubernetes isn’t always the answer

Planet Apache - Thu, 2019-03-14 06:40

Kubernetes is the new hotness, for sure, and pretty awesome to boot. But is it really needed, especially when you’re bootstrapping a new idea? Maybe not. UK-based Freetrade started out with a decently designed Kubernetes-based stack, complete with all the bells and whistles, but ended up scrapping that plan and launching with Firebase functions.

Even now at 20k users, it’s been the right decision for them.

In fact, not only did we not need it - but if we’d launched with this stack with just two engineers (our launch team size!) I’m confident our customers would have been very unhappy.

As they continue to grow, I'm sure that Freetrade may end up putting some functionality into on-demand container instances or even spin up a Kubernetes cluster to handle the parts of their app that need it. When they do, however, I'm sure they'll be able to leave a significant portion of their application surface in functions.

Categories: FLOSS Project Planets

Krita 4.2.0: the First Painting Application to bring HDR Support to Windows

Planet KDE - Thu, 2019-03-14 05:53

We’re deep in bug fixing mode now, because in May we want to release the next major version of Krita: Krita 4.2.0. While there will be a host of new features, a plethora of bug fixes and performance improvements, one thing is unique: support for painting in HDR mode. Krita is the very first application, open source or proprietary, that offers this!

So, today we release a preview version of Krita 4.2.0 with HDR support baked in, so you can give the new functionality a try!

Of course, at this moment, only Windows 10 supports HDR monitors, and only with some very specific hardware. Your CPU and GPU need to be new enough, and you need to have a monitor that supports HDR. We know that the brave folks at Intel are working on HDR support for Linux, though!

What is HDR?

HDR stands for High Dynamic Range. The opposite is, of course, Low Dynamic Range.

Now, many people when hearing the word “HDR” will think of the HDR mode of their phone’s cameras. Those cameras map together images taken at different exposure levels in one image to create, within the small dynamic range of a normal image, the illusion of a high dynamic range image, with often quite unnatural results.

This is not that! Tone-mapping is old hat. These days, manufacturers are bringing out new monitors that can go much brighter than traditional monitors, up to 1000 nits (a standard unit of brightness), or even brighter for professional monitors. And modern systems, with Intel 7th generation Core CPUs or later, support these monitors.

And it's not just brightness: these days most normal monitors are manufactured to display the sRGB gamut. This is fairly limited, and lacks quite a bit of the greens (some professional monitors have a wider gamut, of course). HDR monitors use a far wider gamut, the Rec. 2020 colorspace. And instead of using traditional exponential gamma correction, they use the Perceptual Quantizer (PQ), which not only extends the dynamic range to sun-bright values, but also allows encoding very dark areas, not available in usual sRGB.

And finally, many laptop panels only support 6 bits per channel; most monitors only 8 bits, monitors for graphics professionals 10 bits per channel — but HDR monitors support from 10 to 16 bits per channel. This means much nicer gradations.

It’s early days, though, and from a developers’ point of view, the current situation is messy. It’s hard to understand how everything fits together, and we’ll surely have to update Krita in the future when things straighten out, or if we discover we’ve misunderstood something.

So… The only platform that supports HDR is Windows 10 via DirectX. Linux, nope, OpenGL, nah, macOS, not likely. Since Krita speaks OpenGL, not DirectX, we had to hack the OpenGL to DirectX compatibility layer, Angle, to support the extensions needed to work in HDR. Then we had to hack Qt to make it possible to convert the UI (things like buttons and panels) from sRGB to p2020-pq, while keeping the main canvas unconverted. We had to add, of course, a HDR capable color selector. All in all, quite a few months of hard work.

That’s just the technical bit: the important bit is how people actually can create new HDR images.

So, why is this cool?

You’ve got a wider range of colors to work with. You’ve got a wider range of dark to light to work with: you can actually paint with pure light, if you want to. What you’re doing when creating an HDR image is, in a way, not painting something as it should be shown on a display, but as the light falls on a scene. There’s so much new flexibility here that we’ll be discovering new ways to make use of it for quite some time!

If you’d have a HDR compatible set-up, and a browser that would support HDR, well, this video could be in HDR:

(Image by Wolthera van Hövell tot Westerflier)

And how do I use it?

Assuming you have an HDR-capable monitor, a DisplayPort 1.4 or HDMI 2.0a (the ‘a’ is important!) or higher cable, the latest version of Windows 10 with WDDM 2.4 drivers, and a CPU and GPU that support this, this is how it works:

You have to switch the display to HDR mode manually, in the Windows settings utility. Now Windows will start talking to the display in p2020-pq mode. To make sure that you don’t freak out because everything looks weird, you’ll have to select a default SDR brightness level.

You have to configure Krita to support HDR. In the Settings → Configure Krita → Display settings panel you need to select your preferred surface. You’ll also want to select the HDR-capable small color selector from the Settings → Dockers menu.

To create a proper HDR image, you will need to make a canvas using a profile with the rec 2020 gamut and a linear tone-response-curve: “Rec2020-elle-V4-g10.icc” is the profile you need to choose. HDR images are standardized to use the Rec2020 gamut, and the PQ trc. However, a linear TRC is easier to edit images in, so we don’t convert to PQ until we’re satisfied with our image.

Krita’s native .kra file format can save HDR images just fine. You should use that as your working format. For sharing with other image editors, you should use the OpenEXR format. For sharing on the Web, you can use the expanded PNG format. Since all this is so new, there’s not much support for that standard yet.

You can also make HDR animations… Which is pretty cool! And you can export your animations to mp4 and H.265. You need a version of FFMpeg that supports H.265. And after telling Krita where to find that, it’s simply a matter of:

  • Have an animation open.
  • Select File → Render Animation
  • Select Video
  • Select Render as MPEG-4 video or Matroska
  • Press the configure button next to the fileformat dropdown.
  • Select at the top ‘H.265, MPEG-H Part 2 (HEVC)’
  • Select for the Profile: ‘main10’.
  • The HDR Mode checkbox should now be enabled: toggle it.
  • Click ‘HDR Metadata’ to configure the HDR metadata
  • Finally, when done, click ‘render’.

If you have a 7th gen Intel Core CPU or later with integrated Intel graphics, you can take advantage of hardware-accelerated encoding to save time in the export stage; FFmpeg does that for you.

Download

Sorry, this version of Krita is only useful for Windows users. Linux graphics developers, get a move on!

Windows
Categories: FLOSS Project Planets

James Duncan

Planet Apache - Thu, 2019-03-14 05:45

Helping companies through the Microsoft for Startups program has reinforced my feeling that many problems companies face as they grow are really people problems in disguise. To help figure out what a startup needs to focus on at each phase of the startup lifecycle, Wendy van Ierschot gives us a road map.

We’ve identified five stages, each pegged to a certain number of employees. For each stage, we’ve established which areas an organisation should put its HR focus on.

The diagram in the article showing what to focus on at each stage feels about right, and it’s a better starting point to implementing HR strategy in a startup than having nothing at all.

Categories: FLOSS Project Planets

Python Anywhere: System update this morning

Planet Python - Thu, 2019-03-14 05:44

This morning's system update went smoothly :-)

It was primarily a maintenance update, bringing our US-based system up to the same version as our EU-based system. There were a number of minor bugfixes, along with a bunch of improvements to our system administration tools, which won't be visible to you, but do mean that we'll be able to spend less time on admin stuff -- which gives us more time to work on adding cool new features!

Categories: FLOSS Project Planets

New package in Fedora: python-xlsxwriter

Planet KDE - Thu, 2019-03-14 05:42

XlsxWriter is a Python module for creating files in xlsx (MS Excel 2007+) format. It is used by certain Python modules that some of our customers needed (such as the OCA report_xlsx module).
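
A minimal usage sketch of the module's documented API, writing a couple of cells to a new workbook:

import xlsxwriter

workbook = xlsxwriter.Workbook('hello.xlsx')  # Create the .xlsx file
worksheet = workbook.add_worksheet()

worksheet.write('A1', 'Hello, Fedora!')       # Write a string to cell A1
worksheet.write('A2', 123)                    # Write a number to cell A2

workbook.close()                              # Must be called to finalise the file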

This module is available on PyPI, but it was not packaged for Fedora. I've decided to maintain it in Fedora and created a package review request, which was helpfully reviewed by Robert-André Mauchin.

The package, providing a Python 3 compatible module, is available for Fedora 28 onwards.

Categories: FLOSS Project Planets

Kristof De Jaeger: First stable release of the IndieWeb module for Drupal 8

Planet Drupal - Thu, 2019-03-14 04:48

About a year ago, I only just learned about the principles of IndieWeb, which in a way is a bit of a shame. Fast forward to now, and I'm proud to announce the first stable release for Drupal 8. Together with this milestone, I also pushed a new version of Indigenous so that both are aligned feature wise.

It's been a great journey so far, and while there's still a lot to do for both projects, the stability and feature set warrants a stable tag. It has changed the way I interact with (social) media day to day now since the last half year, both in reading and posting, being in full control of every aspect. It's great, everyone should try it!

What's next?

I've been thinking the last few weeks to raise funding, but after much consideration, I'm not going forward on that path. Even though my public GitHub profile lists over 1300 contributions the last year (about 3.5 per day), which somehow is simply crazy, I still have more than enough spirit and motivation to keep on going. Just a little slower from now on, since many features for both projects are not mission critical - even though they are awesome. Of course, I won't mind if someone would suddenly feel the urge to sponsor me.

Slowing down, you think, that can't be true? Right. As already announced a few weeks ago, the next focus will be writing an ActivityPub module for Drupal so you can communicate with your site on the Fediverse. I'm currently using Bridgy Fed for this, but, in the IndieWeb spirit, it's time to bring this home!

But first, time to make sure I don't mess up my tryouts of the Moonlight sonata. No commits until after March 31st - I promise :)

Categories: FLOSS Project Planets

Agiledrop.com Blog: Top Drupal blog posts from February 2019

Planet Drupal - Thu, 2019-03-14 03:39

We’re back with an overview of the top Drupal blog posts from last month. Have a read and get yourself up to speed on the most recent goings-on within the Drupal community!

READ MORE
Categories: FLOSS Project Planets

Timothy Chen: The power of choice in data-aware cluster scheduling

Planet Apache - Thu, 2019-03-14 00:39

In this post we'll cover a scheduler called KMN that aims to solve the problem of scheduling I/O-intensive tasks in distributed compute frameworks like Spark or MapReduce. This scheduler is different from the ones we discussed previously, as it emphasizes data-aware scheduling, which we'll cover in this post.

Background

Today's batch computing frameworks like Hadoop and Spark run a number of stages and tasks for each job, which build into a DAG (directed acyclic graph) of dependencies. If we assume a large portion of these jobs are I/O intensive, then one of the scheduler's jobs is to try to minimize the time it takes for tasks to read their data. However, in a large multi-tenant cluster, the perfect node with data locality is often unavailable.

Data applications and algorithms today also have the option of choosing only a subset of the source data to approximate the answer, instead of requiring the full data set.

Spark and MapReduce frameworks typically have input tasks that read source data and intermediate tasks that receive data forwarded from the input tasks for further processing. For input tasks, the scheduler can optimize by placing tasks closer to the source data (locality). For intermediate tasks, the scheduler instead optimizes for minimizing the network transfer from the input tasks. One of the main bottlenecks for in-cluster network bandwidth is over-saturated cross-rack links. Using past Facebook traces, the authors simulated what happens when both data locality and freedom from network contention are achieved, and estimated an 87.6% performance increase.

KMN Scheduler

The KMN scheduler is implemented in Spark and provides an application interface that allows users to choose what fraction of the input data the query will select (1-100%).

Based on all N available inputs and their locality choices, the KMN scheduler launches input tasks (one-to-one transfers) on a random sample of K of the available blocks, choosing blocks with memory locality.
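
As a toy illustration of that sampling step (this sketch is mine, not code from the paper), assume each input block records the hosts that hold it in memory:

import random

# Hypothetical map of input block -> hosts holding it in memory (N = 20 blocks)
blocks = {'block-%d' % i: ['host-%d' % (i % 4)] for i in range(20)}

K = 10  # the job only needs K of the N blocks, e.g. a 50% sample
chosen = random.sample(list(blocks), K)

# Launch each input task on a host that already has its block in memory
for block in chosen:
    host = blocks[block][0]
    print('launch input task for %s on %s' % (block, host))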

For intermediate tasks that do many-to-one transfers, the authors' main insight is that the key to avoiding skew in cross-rack network bandwidth is to launch more than K input tasks (M tasks), since this gives the downstream tasks more choices of where to transfer data from, which avoids the skew. While finding the optimal rack placement for tasks is an NP-hard problem, the authors suggest either a greedy search, which works best for small jobs, or a variant of round-robin for larger jobs; both work quite well in their setup.

One important decision here is how many additional tasks to launch. Too many extra tasks cause longer job wait times (also taking stragglers into account), but too few additional tasks can cause network imbalance problems; the goal is to find the sweet spot between the two. One strategy is for the scheduler to decide how long it is going to wait for upstream tasks to launch and complete before firing the downstream tasks, so that when you do encounter stragglers you won't be waiting for all of them to complete in your sample.

Thoughts

Having chatted with several companies operating large on-prem clusters, I can say that cross-rack network congestion is still a real problem. While the importance of data locality is decreasing over time given the faster networks available in the cloud, I think cross-AZ traffic and network congestion are still problems that companies often run into in the cloud.

I can certainly see all distributed data frameworks becoming more aware of cluster resource bottlenecks when making task placement and data distribution decisions.

Categories: FLOSS Project Planets

Bryan Pendleton: The Weight of Ink: a very short review

Planet Apache - Thu, 2019-03-14 00:11

Who can resist a love story?

Who can resist a love story, set in a library?

It's two great tastes, that taste great together: Rachel Kadish's The Weight of Ink.

Well, I'm not really being fair. It's not set in a library, it's set (partially) in a Rare Manuscripts Conservation Laboratory.

In a library.

Well, it's also set in a kibbutz in Israel.

Oh, and it's also set in 17th century London, during the time of the Inquisition, and the Plague, and yet also, the time of the birth of modern Philosophy.

It's a book about Baruch Spinoza, who you might never have spent much time thinking about (certainly I never did), and it's a book about being Jewish in England during a time when that was only barely legal.

And it's DEFINITELY a love story.

But it's rather a non-traditional love story, not least because a lot of it is about People Who Love Books, both now and then, back in the days when a book was still a thing that People Who Love Books built by hand, with agonizing care.

Our heroes and heroines are the sort of people who know immediately what a rare thing it is to find a 350 year old book, or even writing of any sort:

Her eyes were on the book. "Iron gall ink," she said after a moment.

Following her gaze, he understood that the damage had been done before he ever touched the ledger. The pages were like Swiss Cheese. Letters and words excised at random, holes eaten through the page over the centuries by the ink itself.

And they are the sort of people who can survive the most horrible tortures and injuries, and yet the thing that pains them the most is the loss of books:

Before she knew what she was saying, she turned to the rabbi. "What do you see," she said, "behind the lids of your eyes?"

For the first time there was unease beneath his silence. She felt a hard, thin satisfaction she was ashamed of.

"I shall not, at this moment, answer this question," he said. "But I will tell you what I learned after I lost my sight, in the first days as I came to understand how much of the world was now banned from me -- for my hands would never again turn the pages of a book, nor be stained with the sweet, grave weight of ink, a thing I had loved since first memory. I walked through rooms that had once been familiar, my arms outstretched, and was fouled and thwarted by every obstacle in my path. What I learned then, Ester, is a thing that I have been learning ever since."

The literary technique of trying to tell two stories, one old and set in the past, and one new and set in the current time, is well-known, and although it can be powerful, it can also be a bit of a crutch.

It also leads to a situation in which the book is packed full of characters, and can be a tad confusing when you jump back and forth, although I felt like, overall, The Weight of Ink pulled this off well, and did not over-burden the reader.

Some of the characters are extraordinarily compelling, and front and center is surely Ester, the 17th-century orphan girl who comes to live in the household of a blind, dying rabbi.

Other characters are, well, not quite so gripping, such as the young heiress Mary, or the extremely annoying graduate student Aaron.

But for my money, my favorite was the aging scholar Helen, absorbed in the study of history, dragging herself out of bed every day, overcoming her advanced Parkinson's disease, to get into the library and spend her time with The Books:

For a long time, Helen sat in the silent laboratory. All around her, on shelves and tables, on metal trays and in glass chambers, lay a silent company of paper: centuries old, leaf after leaf, torn or faded or brittle. Pages inked by long-dead hands. Pages damaged by time and worse. But they -- the pages -- would live again.

The climactic scene in which Helen must go to face the Dean, who waits for her to deliver her requested resignation, is remarkably more vivid and compelling and heart-wrenching than you could possibly imagine.

A large part of The Weight of Ink is the painstaking detective story of the literary historians, discovering The Books, poring over their contents, and then, slowly, but surely, reading between the lines to understand what they really say.

But The Weight of Ink succeeded, for me, because it balances that detective story quite nicely with the fill-in-the-blanks story of Ester and her adventures in London.

Bit by bit, page by page, Aaron and Helen come to understand what Ester's life was like, and what she did and thought and felt.

And yet, how could they? How could any of us know what it was like to be a young girl, alone in a city of tragedies at a time of horrors, still consumed by those most elemental of human passions:

"No," she said. "No, it's not that way. I choose with my heart, and my heart is for you." As she said it she felt her heart insisting within her ribs -- indeed, for the first time in her life she almost could see her heart, and to her astonishment it seemed a brave and hopeful thing: a small wooden cup of some golden liquid, brimming until it spilled over all -- the rabbi breathing in his bed, the dim candlelight by which Ester had so long strained at words on the page, the dead girl with her father in the cart. All that was beautiful and all that was precious, all of it streaming with sudden purpose here -- to this place where they now stood.

And, of course, in and around it all, there is that Birth of Modern Philosophy business, with plenty of Hobbes and Descartes and Spinoza.

And whether that's your thing, or not, probably depends a lot on how you feel about Philosophy.

The folly of her own words astonished her. She pulled the papers back from over the water, and read more, and as she read she saw the enormity of her blindness. In her arrogance and loneliness she'd thought she understood the world -- yet its very essence had been missing from her own philosophy.

The imperative -- she whispered it to herself -- to live. The universe was ruled by a force, and the force was life, and live, and live -- a pulsing, commanding law of its own. The comet making its fiery passage across their sky didn't signify divine displeasure, nor did it have anything to say of London's sin; the comet's light existed for the mere purpose of shining. It hurtled because the cosmos demanded it to hurtle. Just as the grass grew in order to grow. Just as the disfigured woman must defy Bescos, who'd consider her unfit for love; just as Ester herself had once, long ago, written because she had to write.

But I suspect that most People Who Love Books are also people who are quite interested in Philosophy, so I suspect that it's actually a pretty fair bet that if you want to read a love story (actually four or five different love stories, as it turns out) set in a library (yes, yes, I know, a Rare Manuscripts Conservation Laboratory), then you probably want to read a fair amount about the early days of Rationalism and its conflicts with the major Religious Philosophies of the day.

Or maybe you just want to read a great love story!

Oh, to heck with it: go read The Weight of Ink. It's well worth your time.

Categories: FLOSS Project Planets

Promet Source: Top 12 Accessibility Issues

Planet Drupal - Wed, 2019-03-13 20:41
Are you concerned about web accessibility issues that might be hidden within your pages? We recently gathered input from the Promet accessibility team concerning digital accessibility issues that are most often in need of remediation, and we came up with a Top 12 List of web accessibility mistakes and oversights. They pertain to:  
Categories: FLOSS Project Planets
