FLOSS Research

Growing The Community

Open Source Initiative - Tue, 2017-06-20 23:01

How can you grow an open source community? Two blog posts from The Document Foundation (TDF) illustrate a proven double-ended strategy to sustain an existing community.

Since it was established in 2010, the LibreOffice project has steadily grown under the guidance of The Document Foundation (TDF), where I’ve been a volunteer — most recently as a member of its Board. Starting from a complex political situation and a legacy codebase suffering extensive technical debt, TDF has cultivated both individual contributors and company-sponsored contributors and moved beyond those issues to stability and effectiveness.

What made the project heal and grow? I’d say the most important factor was keeping the roles of individual and company contributions and interests in balance, as these two posts illustrate.

Individual Contributors

The first priority for growth in any open source project is to grow the individual contributor base. TDF has used a variety of techniques to make this possible, many embodied in the first of the two illustrative posts, which announced another Contributor Month for May 2017 and asked people to:

  • Help to confirm bug reports
  • Contribute code
  • Translate the user interface
  • Write documentation
  • Answer questions from users at ask.libreoffice.org
  • Spread the word about LibreOffice on Twitter

As part of the event, the co-ordinator promised to send specially-designed stickers to all contributors, providing positive reinforcement to encourage people to join in and stay around.

Equally importantly, TDF provides introductions to many types of contribution. There is a well-established getting started guide for developer contributors, including the “Easy Hacks” list of newbie-friendly tasks that need doing. There’s also a good introductory portal for translators/localizers.

All of this has been undergirded by sound technical choices that make joining such a complex project feasible:

  • Reimplementing the build system so you don’t have to be a genius to use it
  • Extensive use of automated testing to virtually eliminate uncalled code, pointer errors and other mechanically-identifiable defects
  • An automated build system that stores binaries to allow easy regression testing
  • Translation of comments into English, the project’s language-of-choice (being the most common second language in the technical community)
  • Refactoring key code to use modern UI environments
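The regression-testing point above can be illustrated with a minimal, hypothetical sketch (the function and baseline are illustrative, not LibreOffice's actual harness): output from a known-good build is stored as a baseline, and an automated test flags any divergence for human review.

```python
import unittest

def render_document(text):
    # Hypothetical stand-in for a routine under test; in practice this
    # would exercise real rendering or parsing code from the project.
    return text.upper()

class RenderRegressionTest(unittest.TestCase):
    # Baseline captured from a known-good earlier build and stored with
    # the binaries; any change in output flags a potential regression.
    BASELINE = "HELLO WORLD"

    def test_output_matches_baseline(self):
        self.assertEqual(render_document("hello world"), self.BASELINE)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Run mechanically on every build, checks like this catch mechanically-identifiable defects before a human ever sees them.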

Commercial Opportunity

Secondly, TDF has operated in a way that encourages the emergence of a commercial ecosystem around the code. That in turn has led to the continued employment of a substantial core developer force, even in the face of corporate restructuring.

LibreOffice has a very large user base — in the double-digit millions worldwide — and many of those visiting the download page make financial donations to help the project. That’s a wonderful thing for a project to have, but it turns out that having money is not necessarily an easy blessing. It needs to be spent, but in a way that does not poison the contributor community.

Rather than hiring staff to work on the software, the project identifies areas where unpaid volunteers are not likely to show up and its Engineering Steering Committee proposes a specification. The Board of Directors then creates a competitive tender so that individuals or companies can earn a living (and gain skills) making LibreOffice better while still retaining the collaborative spirit of the project. Doing this builds commercial capability and thus grows the overall community.

So the second post is a sample tender, this one for some unglamorous refactoring work to update the SVG handling. TDF is likely to receive a number of costed proposals, and the Board will decide how to award the work based on the greatest community benefit — a combination of optimum cost, involvement of community members, the nature of the bidder and more. The result is a community contributor fairly compensated without creating a permanent TDF staff role.

Balance Yielding Benefit

As a result of this and other carefully-designed policies, the LibreOffice project now has a developer force in the hundreds, companies that benefit from LibreOffice enough to make significant contributions, a reworked and effective developer environment with automated testing to complement the test team, and one of the most reliable release schedules in the industry. This means new features get implemented, bugs get fixed and, most importantly, security issues are rapidly resolved when identified.

This is what OpenOffice.org became when it left the commercial world. The early controversies have passed and the project has a rhythm that’s mature and effective. I think it’s actually better now than it was at Sun.

(This article originally appeared on "Meshed Networks" and was made possible by Patreon patrons.)
Image credit: Simon Phipps, 2017, CC-BY-ND.

Categories: FLOSS Research

OW2 Consortium: Building Beyond Europe

Open Source Initiative - Thu, 2017-06-15 08:12

This year marks the 10th anniversary of OW2, and the organization is celebrating during its annual conference, on June 26-27, in Paris, France. OSI GM Patrick Masson sat down with Cedric Thomas, CEO of OW2, to learn more about the foundation, its accomplishments over the past 10 years, and what’s in store for the anniversary celebration.

The Open Source Initiative (OSI) Affiliate Membership Program is an international who’s who of open source projects, advocates, and communities: Creative Commons, Drupal Association, Linux Foundation, Mozilla Foundation, Open Source Matters (the foundation supporting Joomla), Python Software Foundation, Wikimedia Foundation, WordPress Foundation and many more. Open source enthusiasts outside Europe may not be as familiar with another OSI Affiliate Member, OW2; however, its impact on open source development and adoption across the EU has been significant.

“...open source is not only a great way to develop great software collaboratively,
but also a new way of doing business in the software industry.”

Patrick Masson: Can you tell me a little bit about OW2 for those who may not be familiar with the organization?

Cedric Thomas: OW2 is a non-profit, founded in 2007, dedicated to the development of enterprise-level open source software. We are very proud to be celebrating our tenth anniversary this year. OW2 inherited a repository of open source middleware solutions called ObjectWeb, which was supported by Inria, the French Institute for Research in Computer Science and Automation, and Bull ATOS Technologies. We launched OW2 with the vision that open source is not only a great way to develop great software collaboratively, but also a new way of doing business in the software industry.

OW2's technical focus is on software that is more related to back-end technical excellence than to customer-facing competitive advantage. It's the kind of software that can be developed collaboratively by different companies even if they are competitors. We focus on infrastructure software for distributed business processes. Our scope encompasses middleware and application platforms for the management of business processes (BPM), business intelligence (BI), big data, content (CMS), identity and access, collaboration (wiki, chat) and cloud computing. We also welcome projects that help develop and manage these distributed systems. Our code base also has tools and frameworks for development, testing and workload simulation.

Masson: Is it fair to say OW2 is primarily working with organizations in Europe? What are some of the benefits you've found working across the EU?

Thomas: OW2 was founded by members from all over the world: the US, Brazil, China and Europe — we are a global organization. We have no physical offices, we live on the web, so to speak, and our team has been global from the beginning. We have strong ties with open source communities in Brazil and China, for example. But you are right, after ten years it is fair to say that, although our strategy is global, we have the most traction in Europe, with almost 2/3 of our new individual members last year (in 2016) coming from Europe. It is a big market with a keen interest in open source. OW2 is easily accessible for SMEs and collaborative projects, and we are regularly invited to take part in publicly financed R&D projects with which we have real experience. For many, we have become the de facto EU-driven open source organization.

“...we live on the web, so to speak, and our team has been global from the onset...
we have become the de facto EU-driven open source organization.”

Masson: What about challenges?

Thomas: There are lots of very good open source developers in Europe, but there are very few software industry leaders. That's clearly our main challenge. It's a geography of good open source contributors and clever consumers, but the software industry leadership is not in Europe. That has an influence on both software vendors and users. Many European vendors may develop world-class technologies while being, in fact, followers from a business point of view. On the user side, many decision makers are conservative, they seek some sort of protection in proprietary software. Another challenge is that Europe remains a fragmented market, as most open source SMEs concentrate on their home market, deriving perhaps only 20% of their revenue from other countries. It's a question of culture, of professional networks and, of course, language. And because of that fragmentation, these SMEs find it difficult to reach a critical mass that could make them industry leaders.

Masson: What types of activities is OW2 working on that expand beyond Europe?

Thomas: As of today, “Beyond Europe” for OW2 really means Brazil, where we support a fledgling local chapter, and China, where our members run a local chapter and organize an annual programming contest for students. I know there is significant potential for OW2 in Africa, the Middle East and Russia, but we do not yet have the bandwidth to properly look after these markets. We have a significant number of visitors and downloads from the US, and this is why we think it is important for OW2 to take part in events that are both local and global, such as OSCON and the OpenStack Summit.

Masson: In the 10 years of OW2's work, what do you think have been the greatest challenges for open source software and communities, not just OW2, but the open source movement?

Thomas: Well, I think the past ten years have been rather good for open source. It’s now recognized as the primary mechanism for collaborative innovation. In addition, the proliferation of non-profit organizations, or foundations, dedicated to projects or technologies, is an excellent illustration of the fantastic momentum open source has seen in recent years. It doesn't mean there haven’t been challenges. As I said, we see open source as a way to do business, but that doesn't mean you can do whatever you want. From that perspective I see two principal challenges. On one hand, there is the issue of defending the integrity of the open source model against all sorts of opportunists who take advantage of it with business strategies based on open core licensing and user lock-in tactics. On the other hand, there is making open source more professional and economically sustainable in a context characterized by a flurry of open source projects, many of which are amateurish and short-lived.

"the Technology Council is also careful not to accept what we call 'crippleware'
...software that is useless unless complemented by additional proprietary software."

Masson: How has OW2 worked to address those challenges?

Thomas: We have addressed these issues through our governance, first by how we accept projects in our code base, and second, how we help them improve quality. Firstly, our Technology Council has always been very careful with the projects accepted in the OW2 code base. Right from the beginning we have ruled out purpose-made or vanity licenses. Our projects must publish their software under a license endorsed by the Open Source Initiative. Period. But the Technology Council is also careful not to accept what we call “crippleware” i.e. software that is useless unless complemented by additional proprietary software. Secondly, five years ago we launched a quality program aimed at helping project leaders better manage their projects and building trust among users for software in the OW2 code base. Last year we launched the second generation of our quality program: the Open Source Capability Assessment Radar (OSCAR).

"OSS has to prove its worth to conventional business managers
who are far from free software advocates."

Masson: What do you think are the most exciting opportunities for the future of open source software, and how will OW2 play a part?

Thomas: Open source software is going mainstream, and this brings a whole range of new challenges to the open source ecosystem. Even OW2's market sweet spot may be shifting. As I said, OW2 was born out of enterprise information system infrastructure, where software is not dependent on a company's vertical sector – a CMS or an ESB works practically the same way whether in a bank or a hospital – and where sharing and re-use across all sectors brings critical mass and maximizes economies of scale. But, as it becomes increasingly mainstream, OSS expands into new territories where its advantages are not quite so obvious. For example, in going mainstream OSS has to prove its worth to conventional business managers who are far from free software advocates. We are currently leveraging the experience gained through our quality program by developing a transposition of NASA's Technology Readiness Levels into the business world of open source. We will introduce OW2's Market Readiness Index at OW2con, our annual conference, in June. It will bring value to our projects, and stability and predictability for open source decision makers. I see another opportunity in the IoT world: it is still pretty much structured in silos, where software is dependent on usage-specific hardware and standards and where developments are not easily shared across different usages. As it matures, I expect the IoT world to become a market for horizontal platforms. This will open tremendous opportunities for an experienced organization such as OW2.


OSI Joins Internet-wide Day of Action to Protect Net Neutrality

Open Source Initiative - Mon, 2017-06-12 17:57

On July 12, 2017, websites, Internet users, online communities and the Open Source Initiative will come together to sound the alarm about the FCC’s attack on net neutrality. Learn how you can join the protest and spread the word at https://www.battleforthenet.com/july12.

Right now, new FCC Chairman and former Verizon lawyer Ajit Pai has a plan to destroy net neutrality and give big cable companies immense control over what we see and do online. If they get their way, the FCC will give companies like Comcast, Verizon, and AT&T control over what we can see and do on the Internet, with the power to slow down or block websites and charge apps and sites extra fees to reach an audience.

If we lose net neutrality, we could soon face an Internet where some of your favorite websites are forced into a slow lane online, while deep-pocketed companies who can afford expensive new “prioritization” fees have special fast lane access to Internet users – tilting the playing field in their favor.

But on July 12th, the Internet will come together to stop them. Websites, Internet users, and online communities will stand tall, and sound the alarm about the FCC’s attack on net neutrality.

The Battle for the Net campaign will provide tools for everyone to make it super easy for your friends, family, followers to take action. From the SOPA blackout to the Internet Slowdown, we've shown time and time again that when the Internet comes together, we can stop censorship and corruption. Now, we have to do it again!

Learn more and join the action here: https://www.battleforthenet.com/july12


Proprietary Election Systems: Summarily Disqualified

Open Source Initiative - Fri, 2017-04-28 17:06

The following was provided by Brent Turner, Secretary of the California Association of Voting Officials (CAVO), an OSI Affiliate Member.

Hello Open Source Software Community & U.S. Voters,

I and the California Association of Voting Officials represent a group of renowned computer scientists who have pioneered open source election systems, including "one4all," New Hampshire’s Open Source Accessible Voting System (see attached). Today, government organizations like NASA, the Department of Defense, and the U.S. Air Force rely on open source software for mission-critical operations. CAVO and I believe voting and elections are indeed mission-critical to protecting democracy and fulfilling the promise of the United States of America as a representative republic.

Since 2004, the open source community has advocated for transparent, secure, publicly owned election systems to replace the insecure, proprietary systems most often deployed within communities. Open source election systems can reduce costs to taxpayers by as much as 50% compared to traditional proprietary options, and they also eliminate vendor lock-in: the inability of an elections office to migrate away from a solution as costs rise or quality decreases.

Like others, CAVO is opposed to the AB-668 Voting Modernization Bond Act of 2018. Although this Act might help our efforts toward better, open source election systems, the bond funds it creates — 450 million dollars — can be used by California localities to procure insecure, over-priced, proprietary systems, which will only lock these communities into additional commitments (and more costs). This Act would set a horrible precedent for the United States, only exacerbating the growing national security and confidence crisis surrounding elections and their outcomes.

CAVO finds it incomprehensible that an elected leader in the United States would push for funding of these systems deemed insecure by government studies. Their proprietary nature disqualifies them summarily.

Both CAVO and I are available to provide further information.

Best regards,
Brent Turner
Secretary, California Association of Voting Officials


React to React

Open Source Initiative - Thu, 2017-04-27 07:36

The OSI has received several inquiries concerning its opinion on the licensing of React [1], which is essentially the 3-clause BSD license along with, in a separate file, an 'Additional Grant of Patent Rights' [2].

The Additional Grant of Patent Rights is a patent license grant that includes certain termination criteria. These termination criteria are not entirely unprecedented when you look at the history of patent license provisions in OSI-approved licenses, but they are certainly broader than the termination criteria (or the equivalent) in several familiar modern licenses (the Apache License 2.0, EPL, MPL 2.0, and GPLv3).

The 'Additional Grant' has attracted a fair amount of criticism (as did an earlier version which apparently resulted in some revisions by Facebook). There was a recent blog post by Robert Pierce of El Camino Legal [3] (which among other things argues that the React patent license is not open source). Luis Villa wrote an interesting response [4].

What do members of the license-discuss community think about the licensing of React? OSI Board Director Richard Fontana posed a few issues to the OSI's License Discuss forum (you can see the thread here):

  • does the breadth of the React patent termination criteria raise OSD-conformance issues or otherwise indicate that React should not be considered open source?
  • is it good practice, and does it affect the open source status of software, to supplement OSI-approved licenses with separate patent license grants or nonasserts? (This has been done by some other companies without significant controversy.)
  • if the React patent license should be seen as not legitimate from an OSI/OSD perspective, what about the substantial number of past-approved (if now mostly obsolete) licenses that incorporated patent license grants with comparably broad termination criteria?
  • should Facebook be encouraged to seek OSI approval for the React license including the patent license grant?

Please note, at the time of this post and discussion, Facebook was a Corporate Sponsor of the Open Source Initiative.

[1] https://facebook.github.io/react/
[2] https://github.com/facebook/react/blob/master/PATENTS
[3] http://www.elcaminolegal.com/single-post/2016/10/04/Facebook-Reactjs-License
[4] http://lu.is/blog/2016/10/31/reacts-license-necessary-and-open/

Volunteering at the 2017 SFBay ACT-W conference

Open Source Initiative - Mon, 2017-04-17 10:48

Special guest article by Sean Roberts (@sarob). The OSI would like to thank Sean, Erich and Zack for all their help in promoting and staffing the OSI booth during the Bay Area ACT-W, and their support of open source software and women in technology. The OSI frequently reaches out to our membership and local open source software advocates to help organize and staff local events—if you're interested in volunteering too, please contact us.

What Was This

I had the privilege of volunteering at the Open Source Initiative (OSI) table at the ACT-W conference at Galvanize, San Francisco, last Saturday with Erich Clauer and Zachariah Sherzad. It was an event focused on giving women the best information on advancing in technical careers. The keynotes and talks sounded excellent on paper, but I missed out on them, as I spent the day in the career fair part of the event. There were many volunteer tables set up in the career area, and OSI was one of them. PyLadies, ChickTech and DocuSign, among others, were there to support technical women. I answered questions about OSI and open source. Attendees had a mix of experience levels, but most were just starting their technical careers.

What Was Happening

Everyone was networking, which is always good practice. The energy was positive and welcoming. I was able to talk with many women about what OSI stands for and what I could help them with. Describing the benefits of open source is very enjoyable, and there were a lot of opportunities to engage people directly. I felt my mentoring on how to join public open source teams was good and valuable information. However, I wasn’t helping with the most important need of all: a paying job.

How To Improve

I should have had a list of open source positions from all the OSI affiliates. Through the ACT-W organizers, I should have asked for digital resumes and LinkedIn URLs before the event started. As people came forward, I could have looked up their resume and matched them with possible openings. Then I could counsel them on their experience and plans.

So, I am going to start working with OSI to figure out how we can add these and additional services.


Originally posted at sarob, April 14th, 2017, licensed under a Creative Commons Attribution 4.0 International License.


Software Freedom Advocates Invited to the Open Source Initiative's Community Social

Open Source Initiative - Sat, 2017-04-01 13:41

OSI is hosting a community social for the open source software community: join us for drinks, dining and discussions.

Meadhall, Cambridge, MA April 04, 2017 -- If you're in the greater Boston, Massachusetts area on Tuesday, April 4th, the OSI Board of Directors invites you to join us and the free and open source software community for a casual evening get-together (Oh, and please feel free to bring a friend).

If you're interested in extending your open source activities, the following day the OSI will be hosting a symposium on Open Source Software, Open Science, Open Source Hardware, and their role in innovation: "Open Source for Science + Innovation." Details can be found here.

Details for the community social...

Tuesday, April 4th, 2017
8:00 PM to ???

4 Cambridge Center
Cambridge MA 02142


Open Source for Science + Innovation

Open Source Initiative - Wed, 2017-03-29 11:40

A symposium on Open Source Software, Open Science, Open Source Hardware, and their role in innovation


Who is this symposium for?
This symposium is for anybody with an interest in open source software, open science, open hardware, and innovation.

What is this symposium about?
We are bringing together open source and open science specialists to talk about the “how and why” of open source and open science. Members of these communities will give brief talks which are followed by open and lively discussions open to the audience. Talks will highlight the role of openness in stimulating innovation but may also touch upon how openness appears to some to conflict with intellectual property interests.

Date/Time: April 5th, 2017, 11:30–17:00

Venue provided by CIC Boston,
Lighthouse room
50 Milk Street, 20th floor
Boston, MA 02109


      11:30-12:00 Opening session
      12:00-13:00 Session 1 (3 talks)
      13:00-13:45 Session 2 (3 talks)
      13:45-14:30 Break and networking
      14:30-15:30 Session 3 (3 talks)
      15:30-16:30 Session 4 (3 talks)
      16:30-17:00 Summary and open discussions

Contact: Kevin Moerman, MIT Media Lab



Phoebe Ayers
MIT Librarian for Electrical Engineering & Computer Science and Research Data Management
Steering Committee member of the Engineering Archive (EngrXiv)
Topic: Managing and sharing your data and code for openness

Kipp Bradford
Research Scientist at the MIT Media Lab
BeagleBoard.org board member
Nation of Makers board member
Topic: "Making Patents Work: Invention, Innovation, and the need for Open"

Molly de Blanc
Director at The Open Source Initiative (OSI)

Andrey Fedorov
Assistant Professor in Radiology at the Surgical Planning Laboratory (SPL), Department of Radiology, Brigham and Women's Hospital and Harvard Medical School
Ambassador for the Centre for Open Science (COS)

Michael Halle
Instructor of Radiology
Harvard Medical School
Director of Technology Development
Director of Visualization
Surgical Planning Lab
Brigham and Women's Hospital

Patrick Masson
GM & Director at The Open Source Initiative (OSI)
Adjunct Professor, University at Albany, College of Engineering and Applied Sciences
“Teaching Open Source: Creating an online course based on Karl Fogel’s text, ‘Producing Open Source Software’”

Kevin Moerman
MIT Media Lab, Biomechatronics
Co-founder and editor for the Journal of Open Source Software (JOSS)
Steering Committee member of the Engineering Archive (EngrXiv)
Developer for the GIBBON open source project


Jessica Polka
Director, ASAPbio, & Visiting Scholar, Whitehead Institute
ASAPbio: a biologist-driven project to promote the use of preprints in the life sciences

Allison Randal
President at the Open Source Initiative (OSI)
Topic: Capabilities for Open Innovation

Travis Rich
MIT Media Lab, Viral Communications Group
Topic: PubPub: a free and open tool for collaborative editing, instant publishing, continuous review, and grassroots journals.

Carol Smith
Director at The Open Source Initiative (OSI)
Topic: Project management in open source communities


OSI Welcomes the Journal of Open Source Software as Affiliate Member

Open Source Initiative - Mon, 2017-03-27 09:24

Open Source Initiative Extends Support for Computational Science and Engineering Research and Researchers.

PALO ALTO, Calif. - March 28, 2017 -- The Open Source Initiative® (OSI), a global non-profit organization formed to educate about and advocate for the benefits of open source software and communities, announced that the Journal Of Open Source Software (JOSS), a peer-reviewed journal for open source research software packages, is now an OSI affiliate member.

JOSS is a response to the growing demand among academics to directly publish papers on research software. Typically, academic journals and papers related to software focus on lengthy descriptions of software (features and functionality), or original research results generated using the software. JOSS provides a means for researchers to directly publish their open source software as source code: the complete set of software with build, installation, test, and usage instructions.

The primary purpose of a JOSS paper is to enable authors to receive citation credit for research software, while requiring minimal new writing. Within academia, it is critical for researchers to be able to measure the impact of their work. While there are many tools and metrics for tracking research outputs, JOSS fills the gap for software source code, which doesn't look like traditional academic research papers, allowing it to be treated as another form of research output.

JOSS has a rigorous peer-review process designed to improve the quality of the software product, and a first-class editorial board experienced at building (and reviewing) high-quality research software. Review criteria assess the research statement of need, installation and build instructions, user documentation and contributing guidelines. The software is also required to have an OSI-approved license and a code of conduct. Authors must include examples of use, tests, and suitable API documentation. Reviewers and authors participate in an open and constructive review process focused on improving the quality of the software—and this interaction itself is subject to the JOSS Code of Conduct.

OSI Board Director Stefano Zacchiroli noted, "JOSS is a clever hack. It addresses the idiosyncrasy of traditional academic publishing that still forces researchers to write bogus papers in order to get credit for the impactful research software they write. By requiring that published software be released under an OSI-approved license, and that a third party archive the software and associate the archive with a DOI, JOSS ensures that published research software enters the software commons and that it be always available for anyone to use, modify, and share."

"On the face of it, writing papers about software is a strange thing to do, especially if there's a public software repository, documentation and perhaps even a website for users of the software. But writing a paper is currently the most recognized method for academics to gain career credit, as it creates a citable entity that can be referenced by other authors," said Arfon Smith, JOSS Editor-in-Chief. "The papers JOSS publishes are conventional papers, other than their short length: the journal is registered with the Library of Congress and has an ISSN. Every JOSS paper is automatically assigned a Crossref DOI and is associated with the ORCID profiles of the authors. If software papers are currently the best solution for gaining career credit for software, then shouldn't we make it as easy as possible to create a software paper?"

In recent years, the OSI has made significant investments in higher education: extending the OSI Affiliate Member Program to institutions of higher education, creating educational materials, sponsoring curriculum development, and even developing a complete online course. Supporting academic research enabled by openly licensed software is a natural progression of the OSI's work in higher education.

The OSI Affiliate Member Program is available at no-cost to non-profit or educational institutions and government agencies—independent groups with a commitment to open source—that support OSI's mission to raise awareness and adoption of open source software and to build bridges among different constituencies in the open source community.

About The Journal Of Open Source Software

Founded in 2016, the Journal of Open Source Software (JOSS) is a developer-friendly peer-reviewed academic journal for research software packages, designed to improve the quality of the submitted software and to make software citable as a research product. JOSS is an open-access journal committed to running at minimal cost, with zero publication fees or subscription fees. With volunteer effort from the editorial board and community reviewers, donations, and minimal infrastructure cost, JOSS can remain a free community service. Learn more about JOSS at: http://joss.theoj.org.

About the Open Source Initiative

Founded in 1998, the Open Source Initiative protects and promotes open source by providing a foundation for community success. It champions open source in society through education, infrastructure and collaboration. The OSI is a California public benefit corporation with 501(c)(3) tax-exempt status. For more information about the OSI, or to learn how to become an affiliate, please visit: http://opensource.org.

Media Contact
Ed Schauweker

Categories: FLOSS Research

San Francisco Open Source Voting System Project Continues On

Open Source Initiative - Thu, 2017-03-23 15:13

This update on San Francisco's project to develop and certify the country's first open source voting system was submitted by OSI Individual Member Chris Jerdonek. While Chris is a member (and President) of the San Francisco Elections Commission, he is providing this update as an individual and not in his official capacity as a Commissioner. Chris' e-mail is chris@sfopenvoting.org, and a website at that domain is expected soon.

Some "Action Items"

Below are some things you can do to help the effort:

  • Show support by following @SFOpenVoting on Twitter: https://twitter.com/SFOpenVoting
  • Retweet the following tweet about a recent front-page news story to help spread the word: https://twitter.com/SFOpenVoting/status/834136200663310336
  • Reply to Chris with the words "keep me posted" if you'd like me to notify you sooner if something interesting happens related to the project (e.g. an RFP or job posting getting posted, an important Commission meeting coming up, the publishing of a news piece about SF open source voting, etc).
  • Reply to Chris with the words "might want to help" if you might like to help organize or be part of a core group of activists to help build more support and otherwise help the project succeed. This would likely start off with a small organizing meeting.
  • Show your support and come watch the next Elections Commission meeting on Wed, April 19 at 6pm in Room 408 of San Francisco City Hall: http://sfgov.org/electionscommission/
San Francisco Examiner Article

At the February 15 Elections Commission meeting, the Elections Commission voted unanimously to ask the Mayor's Office to allocate $4 million towards initial development of the open source voting project for the 2018-19 fiscal year (from Aug. 2018 - July 2019). This would go towards initial development once the planning phase is complete.

The San Francisco Examiner wrote a good article about this development here (with the headline appearing on the front page): http://www.sfexaminer.com/sfs-elections-commission-asks-mayor-put-4m-toward-open-source-voting-system/

Latest Project Updates

The open source voting project is starting to gain some definition. For the latest, read the March 2017 Director's Report in the agenda packet of last week's March 15 Elections Commission meeting: http://sfgov.org/electionscommission/commission-agenda-packet-march-15-2017

A few highlights from the report:

  • Very soon (perhaps within the next few days), the Department will be posting a job opening for a senior staff position to assist with the project.
  • In addition, by the end of March or so, the Department will be issuing an RFP for an outside contractor to help plan and create a "business case" for the project.
  • Software for the project will be released under version 3 of the GNU General Public License where possible. (GPL-3.0 is a copyleft license, which means that future changes would also be assured open source.)
  • The software is projected to be released to the public as it is written, which would be great for increased public visibility and transparency.
Citizen's Advisory Committee

Also at last week's Elections Commission meeting, the Commission started discussing the idea of forming a Citizen's Advisory Committee to help guide the open source voting project. This is an idea that was raised at the February meeting, as well as previously.

At the meeting, it was suggested that the committee help advise primarily on technical matters -- things like agile procurement approaches, project management, open source issues, and engineering / architecture issues.

The Commission will be taking this up again at its April meeting on April 19 (and possibly voting on it).

If you might be interested in serving on the committee, you are encouraged to listen to the discussion from last week's meeting to get an idea of what it might be like (starting at 11 mins, 18 sec): https://www.youtube.com/watch?v=k2-DX8UNqY0&t=11m18s

(And if you know someone who might be good to serve on such a committee, please forward them this info!)

FairVote California house party (recap)

In early March, Chris spoke about San Francisco's open source voting project at a house party organized by FairVote California. Thank you to FairVote California for having me!

If anyone would like me to speak to a group of people about the project, just shoot me an e-mail. It would only require 5 minutes or so of a group's time.

GET Summit (upcoming: May 17-18)

On May 17-18 in San Francisco, there will be a conference on open source and election technology issues called the GET Summit: https://www.getsummit.org

The conference is being organized by Startup Policy Lab (SPL) and has an amazing line-up of speakers, including California Secretary of State Alex Padilla and former Federal Election Commission (FEC) Commissioner Ann Ravel. In September 2016, Alex Padilla said on television that "open source [voting] is the ultimate in transparency and accountability for all." And Ann Ravel made headlines last month with her very public resignation from the FEC. See the conference website for the latest speaker list.

So that's all folks! Please follow @SFOpenVoting on Twitter if you haven't yet, and thank you for all your continued interest and support!

We thank Chris Jerdonek for his work in raising awareness of San Francisco's efforts to develop an open source voting system and sharing these updates with us here at the OSI and the larger open source software community.

Categories: FLOSS Research

Bicho 0.9 is coming soon!

LibreSoft Planet - Thu, 2011-06-09 10:06

During the last few months we've been working to improve Bicho, one of our data mining tools. Bicho gets information from remote bug/issue tracking systems and stores it in a relational database.



The next release, Bicho 0.9, will also include incremental support, something we've missed in FLOSSMetrics and in standalone studies with a huge number of bugs. We also expect that new backends will be easier to create with the improved backend model designed by Santi Dueñas. So far we support JIRA, Bugzilla and SourceForge. For the first two we parse HTML + XML; for SourceForge all we have is HTML, so we are more dependent on the layout (to minimize that problem we use BeautifulSoup). We plan to include at least backends for FusionForge and Mantis (which is partially written) during this year.
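A minimal sketch of how incremental retrieval could work, using Python's sqlite3 module. The schema, field names and `store_issues` helper are illustrative assumptions for this post, not Bicho's actual code: the idea is simply that the database remembers the newest modification timestamp per tracker, so later runs only need to fetch issues changed after it.

```python
import sqlite3

def init_db(conn):
    # Simplified, hypothetical schema: Bicho's real tables differ,
    # but the incremental idea only needs a "modified" timestamp.
    conn.execute("""CREATE TABLE IF NOT EXISTS issues (
                        tracker  TEXT,
                        issue_id TEXT,
                        summary  TEXT,
                        modified TEXT,
                        PRIMARY KEY (tracker, issue_id))""")

def last_update(conn, tracker):
    # Incremental support: only issues modified after this timestamp
    # need to be fetched again from the remote tracker.
    row = conn.execute("SELECT MAX(modified) FROM issues WHERE tracker = ?",
                       (tracker,)).fetchone()
    return row[0]  # None on the first run -> do a full fetch

def store_issues(conn, tracker, issues):
    # Upsert so a re-fetched issue overwrites its stale copy.
    conn.executemany(
        "INSERT OR REPLACE INTO issues VALUES (?, ?, ?, ?)",
        [(tracker, i["id"], i["summary"], i["modified"]) for i in issues])
    conn.commit()

conn = sqlite3.connect(":memory:")
init_db(conn)
store_issues(conn, "bugzilla", [
    {"id": "1", "summary": "Crash on start", "modified": "2011-05-01"},
    {"id": "2", "summary": "Wrong label",    "modified": "2011-06-01"},
])
print(last_update(conn, "bugzilla"))  # 2011-06-01
```

Comparing ISO-formatted dates as text works here because they sort lexicographically; a real backend would also have to handle each tracker's own timestamp format.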

Bicho is currently being used in the ALERT project (still in its first months), where all the information offered by bug/issue reports will be related, through semantic analysis, to the information available in the source code repositories (using CVSAnalY). That relationship will allow us to help developers through recommendations and other more proactive use cases. One of my favorites is recommending a developer to fix a bug based on the analysis of the stack traces posted in the bug. In libre software projects all the information is available on the Internet; the main problem (not a trivial one) is that it is spread across very different resources. Using Bicho against the BTS/ITS we can get the part of the code (function name, class and file) that probably contains the error, and the version of the application. That information can be related to the data obtained from the source code repository with CVSAnalY; in this case we would need to find out which developer edits that part of the code most often. This and other use cases are being defined in the ALERT project.

If you want to stay tuned to Bicho, have a look at the project page at http://projects.libresoft.es/projects/bicho/wiki or the mailing list libresoft-tools-devel _at_ lists.morfeo-project.org


Categories: FLOSS Research

ARviewer, PhoneGap and Android

LibreSoft Planet - Thu, 2011-06-09 05:44
ARviewer is a FLOSS mobile augmented reality browser and editor that you can easily integrate into your own Android applications. This version has been developed using the PhoneGap framework. The browser part of ARviewer draws the label associated with an object of the reality, using both its A-GPS position and its altitude as parameters. The system works both outdoors and indoors, in the latter case with location provided by QR codes. ARviewer labels can be shown through a traditional list-based view or through an AR view, a magic-lens mobile augmented reality UI.

The next steps are:
  • Testing this source code on the iOS platform to check the real portability that PhoneGap provides.
  • Adding the "tagging mode" with PhoneGap, to allow tagging new nodes/objects from the mobile device.

The two screenshots below are very similar, right? We have only found one critical problem: the refresh of nodes in the WebView when using PhoneGap. We will study and analyze this behavior.

ARviewer PhoneGap


ARviewer Android (native)

  • More info: http://www.libregeosocial.org/node/24
  • Source Code: http://git.libresoft.es/ARviewer-phoneGap/
  • Android Market: http://market.android.com/details?id=com.libresoft.arviewer.phonegap
Categories: FLOSS Research

Finding code clones between two libre software projects

LibreSoft Planet - Thu, 2011-05-12 09:05

Last month I worked on a report aimed at finding code clones between two libre software projects. The method we used is essentially the one detailed in the paper "Code siblings: Technical and Legal Implications" by German, D., Di Penta, M., Guéhéneuc, Y. and Antoniol, G.

It is an interesting case, and I'm pretty sure this kind of report will become more and more interesting for entities that publish code under a libre software license. Imagine you are part of a big libre software project where your copyright, and even your money, is at stake: it would be very useful to know whether another project is using your code while respecting your copyright and the rights the license grants to users. To identify these scenarios, our study did the following:

  • extraction of clones with CCFinderX
  • detection of license with Ninka
  • detection of the copyright with shell scripts

The CCFinderX tool used in the first phase reports common parts of the code: it detects a common set of tokens (50 by default) between two files; this parameter should be tuned depending on what is being looked for. In the following example, the second and third columns contain information about the file and the common code. The syntax is (id of the file).(source file tokens), so the example shows that the file with id 1974 shares code with the files with ids 11, 13 and 14.

clone_pairs {
19108 11.85-139 1974.70-124
19108 13.156-210 1974.70-124
19108 14.260-314 1974.70-124
12065 17.1239-1306 2033.118-185
12065 17.1239-1306 2033.185-252
12065 17.1239-1306 2033.252-319
12065 17.1239-1306 2141.319-386
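Given the format above, the clone_pairs section is straightforward to parse. The following is an illustrative Python sketch (the function name and tuple layout are ours, not part of CCFinderX), turning each line into a clone id plus two (file id, start token, end token) spans:

```python
def parse_clone_pairs(lines):
    # Each data line: <clone_id> <file_id>.<start>-<end> <file_id>.<start>-<end>
    # as shown in CCFinderX's clone_pairs section.
    pairs = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith(("clone_pairs", "{", "}")):
            continue  # skip the section header/braces
        clone_id, left, right = line.split()

        def span(field):
            file_id, tokens = field.split(".")
            start, end = tokens.split("-")
            return int(file_id), int(start), int(end)

        pairs.append((int(clone_id), span(left), span(right)))
    return pairs

sample = """clone_pairs {
19108 11.85-139 1974.70-124
19108 13.156-210 1974.70-124
"""
pairs = parse_clone_pairs(sample.splitlines())
print(pairs[0])  # (19108, (11, 85, 139), (1974, 70, 124))
```

From these tuples one can then group spans per file to estimate how much of one project's code appears in the other, keeping in mind the 1:n and intra-project clone caveats discussed below.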

In our report we only wanted to estimate the percentage of code from the "original" project that is used in the derivative work, but some variables need to be taken into account. First, code clones can appear among the files of the same project (by the way, this is a clear sign that refactoring is needed). Second, different parts of a file can have clones in different files (a 1:n relationship) in both projects. The ideal solution would be to study, file by file, the relationship with the others and remove the repeated ones.

Once the relationships among files are established, it is the turn of license and copyright detection. In this phase the method simply compares the output of the two detectors; finally you get a matrix from which it is possible to tell whether the copyright holders were respected and the license was used correctly.

Daniel German's team found interesting things in their study of the FreeBSD and Linux kernels. They found GPL code in FreeBSD in the XFS file system. The trick to distributing this code under a BSD license is to distribute it disabled (it is not compiled into FreeBSD) and leave the user the choice of compiling it or not. If a developer compiles the kernel with XFS support, the resulting kernel must be distributed under the terms of the GPL license.

Categories: FLOSS Research

OpenBSD 4.9 incorporates the /etc/rc.d system

LibreSoft Planet - Wed, 2011-05-04 17:23
A bit of history

As any Unix system administrator knows, init is the first process to run after the kernel loads, and it starts the system's standard userland daemons ("services"). In the original Bell Labs Unix, the init process started userland services through a single shell script called /etc/rc. Years later, the Berkeley Distribution added another script, /etc/rc.local, to start additional services.

This worked for years, until Unix fragmented and, with the arrival of packaged third-party software, AT&T's System V Unix introduced a new directory scheme under /etc/rc.d/ containing service start/stop scripts, sorted by start order, with a key letter prefixed to the file name (S- to start a service, K- to stop it). For example: S19mysql starts [S] the mysql service. These scripts (located in /etc/init.d) were organized into runlevels (described in /etc/inittab), with symbolic links associating the scripts with each runlevel (/etc/rc0.d, rc1.d, rc2.d, etc.). The runlevels in each directory represent shutdown, reboot, single-user or multi-user startup, and so on. This scheme, known as "System V" (or "SysV"), is the one adopted, for example, by Linux distributions (with some differences among them in the location of subdirectories and scripts). Its advantage was removing the danger that a syntax error introduced by a package could abort the execution of the single script and leave the system in an inconsistent state. In exchange, it introduced a certain degree of complexity in managing and maintaining init scripts, directories, symbolic links, etc.

Other Unix-like systems, such as the BSDs, kept the traditional, simple Unix scheme, with only one or two rc files and no runlevels[*], although they gradually incorporated some aspects of the SysV scheme for initializing system services. For example, NetBSD included a System V-style init system similar to Linux's, with individual scripts to control services, but without runlevels. FreeBSD, in turn, integrated NetBSD's rc.d system in 2002 and currently has dozens of init scripts that work analogously to SysV:

$ /etc/rc.d/sshd restart


OpenBSD incorporates /etc/rc.d


OpenBSD, however, had not adopted the subsystem of individual scripts to control services until now, which sometimes caused a certain panic, as if something essential were missing, among people coming to this system for the first time from the Linux world (or other Unices) (although it was never that serious; it is a matter of habits). The current OpenBSD 4.8, released in November 2010, still uses only two init scripts (/etc/rc and /etc/rc.local). OpenBSD 4.9, to be released on May 1, implements this functionality for the first time through the /etc/rc.d directory.

As is usual in OpenBSD, nothing is implemented until it is clear that something is gained and that there is a simple, reliable way for the end user to use it. The mechanism is analogous to that of other Unix-like systems, but simpler, and with some subtle and important differences worth knowing. Let's take a look.

Description of OpenBSD's new /etc/rc.d subsystem

In /etc/rc.conf (which holds the configuration variables for the rc script) we find a new variable called rc_scripts:

# rc.d(8) daemons scripts
# started in the specified order and stopped in reverse order
rc_scripts=

In that variable (or better, as is always recommended, in /etc/rc.conf.local, an optional file that overrides the variables in /etc/rc.conf) we list the daemons we want started at boot, in start order:

rc_scripts="dbus_daemon mysql apache2 freshclam clamd cupsd"

The service init scripts live, as usual, in the /etc/rc.d directory. A key difference, however, is that even if the scripts are placed there, nothing starts automatically unless it is listed in the rc_scripts variable, following OpenBSD's principle of avoiding assumed defaults. Each script responds to the following actions:

  • start    Starts the service if it is not already running.
  • stop     Stops the service.
  • reload   Tells the daemon to reload its configuration.
  • restart  Performs a stop of the daemon, then starts it again.
  • check    Returns 0 if the daemon is running, or 1 otherwise.

Currently this system is only used for daemons installed from packages, not for the OpenBSD base system. For example, to manage the states of the "foobar" service, previously installed from ports or packages, just run:

/etc/rc.d/foobar reload
/etc/rc.d/foobar restart
/etc/rc.d/foobar check
/etc/rc.d/foobar stop

The last command ("stop") is also invoked on a reboot or shutdown from /etc/rc.shutdown, in the reverse order of the rc_scripts variable, before the "stop/reboot" command is executed for the whole system. There is no need to worry about execution order or about the meaning of an S17 at the beginning of script names.

Another advantage of this implementation is how extraordinarily simple these scripts are to write, compared with other implementations that require scripts of dozens or even hundreds of lines. In its simplest form:

daemon="/usr/local/sbin/foobard"
. /etc/rc.d/rc.subr
rc_cmd $1

A somewhat more complex example:

#!/bin/sh
#
# $OpenBSD: specialtopics.html,v 1.15 2011/03/21 21:37:38 ajacoutot Exp $

daemon="${TRUEPREFIX}/sbin/munin-node"

. /etc/rc.d/rc.subr

pexp="perl: ${daemon}"

rc_pre() {
	install -d -o _munin /var/run/munin
}

rc_cmd $1

As can be seen, a typical script only needs to define the daemon, include /etc/rc.d/rc.subr, and optionally define a regular expression different from the default one, to be passed to pkill(1) so it can find the desired process (the default expression is "${daemon} ${daemon_flags}").

The new script must be placed in ${PKGDIR} with an .rc extension, for example foobard.rc. TRUEPREFIX is substituted automatically at install time.

This simplicity and cleanliness is possible thanks to the rc.subr(8) subsystem, a script containing the internal routines and the more complex logic for controlling daemons. Even so, it is very readable and is under 100 lines long. There is also a template for package and ports developers, shipped in "/usr/ports/infrastructure/templates/rc.template".

And that's all. Any port or package that needs to install a daemon can now benefit from rc.d(8) scripts. The new system may not cover every case, but it covers the needs of ports developers for a standard, simple way to start services. As of March 2011, more than 90 of the most widely used ports have already implemented it. Of course, the old system keeps working for unconverted packages, but there is no doubt that the OpenBSD developers (special mention to Antoine Jacoutot (ajacoutot@) and Robert Nagy (robert@)) have once again struck a good balance between simplicity and functionality. And, as always, for further details never skip the relevant manual pages: rc.subr(8), rc.d(8), rc.conf(8) and rc.conf.local(8), and the web documentation.


(*) The fact that BSD does not implement "/etc/inittab" or "telinit" does not mean it has no runlevels; it is simply able to change its init states through other procedures, without needing "/etc/inittab".

Categories: FLOSS Research

Brief study of the Android community

LibreSoft Planet - Mon, 2011-04-18 12:19

Libre software is changing the way companies build applications: while the traditional software development model pays no attention to external contributions, libre software products developed by companies benefit from them. These external contributions are encouraged by creating communities around the project, and they help the company create a superior product at a lower cost than is possible for traditional competitors. In exchange, the company offers the product free to use under a libre software license.

Android is one of these products. It was created by Google a couple of years ago and follows a single-vendor strategy. As Dirk Riehle noted some time ago, it is a kind of economic paradox that a company can earn money by making its product available for free as open source. But companies are not NGOs; they don't give away money without expecting something in return, so where is the trick?

As a libre software project, Android did not start from scratch: it uses software that would be unavailable to non-libre projects. Besides that, it has a community of external stakeholders who improve and test the latest published version, help create new features and fix errors. It is true that Android is not a community-driven project but a single-vendor one, and Google runs it in a very restricted way. For instance, external developers have to sign a Grant of Copyright License, and they do not even have a roadmap; Google publishes the code after every release, so there are long intervals during which external developers do not have access to the latest code. Even with these barriers, a significant part of the code is provided by external people, either contributed directly to the project or reused from common dependencies (Git provides ways to reuse changes made to remote repositories).

The figures above reflect the monthly number of commits, split in two: in green, commits from the mail domains google.com or android.com (the study assumes these people are Google employees); in grey, the rest of the commits, from mail domains belonging to other companies or to volunteers.

According to the first figure (on the left), which shows the proportion of commits, during the first very active months (March and April 2009) the number of commits from external contributors was similar to that from Google staff. The number of external commits was also large in October 2009, when the total number of commits reached its maximum. Since April 2009 the monthly activity of external contributors seems to be between 10% and 15%.

The second figure (on the right) provides an interesting view of the total activity per month, with two very interesting facts. First, the highest peak of development was reached during late 2009 (more than 8K commits per month over two months). Second, regarding the activity in recent months: as mentioned before, Google staff work in private repositories, so until they publish the next version of Android we won't see another peak of development (take into account that commits in Git rewrite history when the code is published, so the last months in the timeline will be overwritten with the next release).

More than 10% of the commits used by Google in Android were made from mail domains other than google.com or android.com. At this point the question is: who made them?

(Since October 2008)

# Commits  Domain
69297      google.com
22786      android.com
8815       (NULL)
1000       gmail.com
762        nokia.com
576        motorola.com
485        myriadgroup.com
470        sekiwake.mtv.corp.google.com
422        holtmann.org
335        src.gnome.org
298        openbossa.org
243        sonyericsson.com
152        intel.com

Looking at the domain names, it is very surprising that Nokia is one of the most active contributors. This is a real paradox: the company that states that Android is its main competition is helping it! One of the effects of using libre software licenses for your work is that even your competition can use your code. Currently there are Nokia commits in the following repositories:

  • git://android.git.kernel.org/platform/external/dbus
  • git://android.git.kernel.org/platform/external/bluetooth/bluez

This study is an ongoing process that should become a scientific paper; if you have feedback, please let us know.

CVSAnalY was used to get data from 171 Git repositories (the Linux kernel was not included). Our tool allows us to store the metadata of all the repositories in one SQL database, which helped a lot. The study assumes that people working for Google use a @google.com or @android.com domain.
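The classification rule the study relies on can be sketched in a few lines of Python. This is illustrative only: the real study queries a CVSAnalY SQL database rather than a plain list of author addresses, and note that exact domain matching is intentional (e.g. sekiwake.mtv.corp.google.com is counted as external, as in the table above):

```python
from collections import Counter

GOOGLE_DOMAINS = {"google.com", "android.com"}

def classify(author_emails):
    # The study assumes commits from @google.com / @android.com
    # addresses come from Google employees; everything else is
    # counted as an external contribution.
    counts = Counter()
    for email in author_emails:
        domain = email.rsplit("@", 1)[-1].lower()
        counts["google" if domain in GOOGLE_DOMAINS else "external"] += 1
    return counts

sample = ["dev1@google.com", "dev2@android.com",
          "ext1@nokia.com", "ext2@gmail.com", "dev3@google.com"]
print(classify(sample))  # Counter({'google': 3, 'external': 2})
```

Grouping by full domain instead of just the google/external split yields the per-company table shown above.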



Categories: FLOSS Research

AR interface in Android using phoneGap

LibreSoft Planet - Tue, 2011-03-29 06:51

For the last six months we have been evaluating the possibility of implementing a new AR interface (based on our project ARviewer) using PhoneGap. PhoneGap is a mobile framework based on HTML5/JS that allows the same HTML5 source code to run on different mobile platforms (iPhone, Android, BlackBerry). It seems a good way to create portable source code. I have been working on this project for three years with Raúl Román, a great coder!

Currently it is not possible to obtain the camera stream in the WebView widget using PhoneGap, so this part of the source code must be developed natively for each platform. We found another problem: we could not make the WebView transparent so that the camera would show in the background, with objects painted on top in HTML. We asked David A. Lareo (Bcultura) and Julio Rabadán (Somms.net) about this, and they gave us some very interesting clues.

The solution is implemented in the source code you can see below. Our main view (R.layout.main) must be the top-level view; for this we call setContentView and then add DroidGap's main view using addView and getParent. Once our view is mixed with PhoneGap's main view, we set the background color to transparent.

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    super.init();
    super.loadUrl("file:///android_asset/www/index.html");
    setContentView(R.layout.main);
    RelativeLayout view = (RelativeLayout) findViewById(R.id.main_container);
    // appView is the WebView object
    View html = (View) appView.getParent();
    html.setBackgroundColor(Color.TRANSPARENT);
    view.addView(html, new LayoutParams(LayoutParams.FILL_PARENT,
                                        LayoutParams.FILL_PARENT));
    appView.setBackgroundColor(Color.TRANSPARENT);
}

We have just started this project, so I will post the full source code on this blog.
Categories: FLOSS Research

Amarok Code Swarm

Paul Adams: Green Eggs and Ham - Fri, 2009-08-21 06:52

In my previous entry it was commented that it would be nice to see a code swarm for Amarok's history in SVN. Well... go on then.

Code Swarm is a tool which gives a time-based visualisation of activity in SVN. Whilst code swarms are often very pretty and fun to look at for 15 minutes, they are not very informative. Much of what appears is meaningless (e.g. the entry point of the particles) and some of it is ambiguous (e.g. the movement of particles).

Anyhow, I was surprised to see that someone hadn't already made one of these for Amarok. So, here it is:

Amarok Code Swarm from Paul Adams on Vimeo.

Categories: FLOSS Research

Amarok's SVN History - Community Network

Paul Adams: Green Eggs and Ham - Thu, 2009-08-20 09:08

I did not include a "who has worked with whom" community network graph in my previous post on the history of Amarok in SVN. This was largely because that blog post was written quite late and I didn't want to wait ages for the community network graph to be generated.

Well, now I have created it.

Click here for the complete, 8.1MB, 5111x3797 version

So, just to remind you... SVN accounts are linked if they have both worked on the same artifact at some point. The more artifacts they share, the closer together the SVN accounts are placed. The result of this is that the "core" community should appear closer to the middle of the graph.
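The linking rule described here can be sketched in Python. This is illustrative only (the real graphs are laid out with a force-directed algorithm, which this sketch does not attempt): two SVN accounts get an edge when they touched the same artifact, and the edge weight counts how many artifacts they share, which is what pulls frequent collaborators closer together in the plot.

```python
from collections import defaultdict, Counter
from itertools import combinations

def community_network(commits):
    # commits: iterable of (svn_account, artifact_path) pairs.
    touched = defaultdict(set)
    for account, artifact in commits:
        touched[artifact].add(account)
    # Weighted edges: one increment per artifact a pair has shared.
    edges = Counter()
    for accounts in touched.values():
        for a, b in combinations(sorted(accounts), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical log fragment for illustration.
log = [("markey", "src/player.cpp"), ("aumuell", "src/player.cpp"),
       ("markey", "src/engine.cpp"), ("aumuell", "src/engine.cpp"),
       ("berkus", "README")]
print(community_network(log))  # Counter({('aumuell', 'markey'): 2})
```

Feeding the weighted edge list to a spring-embedder layout is what produces the effect described: high-weight pairs end up close together, so the core community gravitates to the middle.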

Categories: FLOSS Research

Amarok's SVN History

Paul Adams: Green Eggs and Ham - Tue, 2009-08-18 17:25

So, as you might have recently seen, Amarok has now moved out of SVN. This was SVN r1002747 on 2009-07-26. Amarok first appeared in /trunk/kdeextragear-1/amarok on 2003-09-07 (r249141) thanks to an import from markey. It was migrated to the simplified extragear structure (/trunk/extragear/multimedia/amarok) at r409209 on 2005-05-04.

So, to mark this event I have created a green blob chart and a plot of daily commits and committers for the entire time Amarok was in SVN.

Simply right-click and download the green blobs to see them in their full-scale glory. I'm sorry the plot isn't too readable; this is caused by a recent day with about 300 commits in Amarok, way above the average. I assume this is scripty gone mad again.

Categories: FLOSS Research

Archiving KDE Community Data

Paul Adams: Green Eggs and Ham - Sun, 2009-08-16 08:23

So me and my number crunching have been quiet for a couple of months now. Since handing in my thesis I have been busier than ever. One of the things keeping me busy has been to make good on a promise I made a while back...

I have, for some time, promised to create a historical archive of how KDE "looked" in the past. To achieve this I have created SVN logs for each calendar month of KDE's history and run the various scripts that I have against them. Here are a few examples for August 1998...

Community Network

A community network graph is a representation of who was working with whom in the given month. You may remember that I have shown these off once or twice before. The nodes are SVN accounts and they share an edge when they have shared an artefact in SVN. The more artefacts that the pair have shared, the closer they are placed together. The result is that the community's more central contributors should appear towards the middle of the plot.


The Green Blobs

Yes, they're back! For the uninitiated the green blobs are representation of who committed in a given week. Read the chart from top to bottom and left to right. The date of the first commit and the % of weeks used are also given.
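The bookkeeping behind such a chart can be sketched in a few lines of Python (illustrative only; this computes which ISO weeks each account committed in, not the rendering of the blobs themselves):

```python
from datetime import date

def green_blobs(commits, accounts):
    # commits: iterable of (svn_account, date) pairs.
    # Returns, per account, the set of ISO (year, week) pairs with at
    # least one commit -- one "green blob" per active week, which the
    # chart then lays out top to bottom and left to right.
    weeks = {a: set() for a in accounts}
    for account, day in commits:
        if account in weeks:
            year, week, _ = day.isocalendar()
            weeks[account].add((year, week))
    return weeks

# Hypothetical log fragment for illustration.
log = [("adams", date(2009, 8, 16)), ("adams", date(2009, 8, 21)),
       ("markey", date(2009, 8, 18))]
blobs = green_blobs(log, ["adams", "markey"])
print(sorted(blobs["adams"]))
```

From the per-account week sets, the "% of weeks used" shown on the chart is just the size of the set divided by the number of weeks since that account's first commit.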

Commits and Committers

I have also gathered basic number of commits and committers per day and created plots, like this...

So, I now have a few things to do:
  • Firstly, I need to find a place where I can store all these images and the source data where they are easily accessible. They will go online somewhere.
  • Secondly, I need to keep taking logs and keeping this archive up-to-date.
  • I also need to create a sensible means for generating SVG versions of the Green Blobs. This was an issue raised back at Akademy in Glasgow and still hasn't been addressed. I'm generally moving all of my visualisations to SVG these days.

In time I will add visualisations for things like email activity as well. If you have any ideas of aspects of the community you want visualised just let me know and I'll see what I can do. In particular, if you want me to run these jobs for your "bit" of KDE (e.g. Amarok, KOffice), just give me a shout and I'll see if I can make time. Better still, why not lend me a hand? Once I have hosting for the visualisations I will be putting all my scripts up with them. Finally.

Whilst the historical data has been visualised for interest, I hope that the new charts, as they are produced, will be helpful for all sorts of activities: from community management and coordination to marketing. Oh... and research, of course.

Categories: FLOSS Research