Linuxcon Europe – Day 3

Posted on November 8th, 2012

Catarina Mota – Open Hardware

Mota, a distinguished researcher and co-founder of the Open Source Hardware Association, opened Day 3 with her keynote on open hardware. She explained how open hardware was lowering the barrier to entry to a number of fields – such as DIY 3D printing and DIY biomedicine – by making it easier to get involved.

She told the story of SparkFun, a company that was started through the need to source parts for open hardware. Parts were originally shipped from someone’s basement – and the company has now grown to an annual turnover of over $20 million.

She went on to demonstrate how the growth of Arduino has been exponential, and how ‘backyard brains’ are now changing the world through open hardware, producing projects like the Global Village Construction Set – which is essentially life-size Lego. 3D printing is also starting to reach maturity, with technology such as RepRaps and MakerBots having more of a presence in the community.

This growth has led to the birth of the Open Source Hardware Association as a body to help support the community as it grows up.

Mota drew a number of distinctions between open hardware and open source software. Asking the question

What does open mean for hardware?

she explained that the assets used in open hardware were things like bills of materials, schematics and CAD designs, and that standard formats for the production of these were again lowering the barrier to entry of the community. A challenge remains in terms of an equivalent repository to something like GitHub or SourceForge for open hardware.

Some of the other challenges facing the open hardware community include sourcing local parts and materials. For example, the specialist plastics used by RepRap and MakerBot printers are difficult to source – and expensive – if you live in the developing world.

Mota also explained how open hardware evolves through iteration, in contrast to open source software, which evolves through collaboration. Each hardware design is an iterative and possibly derivative work – rather than a collaborative one.

Small and medium scale component manufacturing also remains a challenge – for instance breadboard designs are not efficient to produce on a small scale. However, there is now a lot of momentum to aggregate these – say 4 or 5 developers having their boards printed at once by a manufacturer to reduce costs – effectively leading to the crowdsourcing of manufacturing.

Mota illustrated how open hardware is leading to a rebirth in tinkering and repair – arts that have been lost within a consumerist society. Technology like Arduino is being used in education to pique students’ interest in technology, and the business model that is currently used –

give away the bits, sell the kit

is still apparently viable.

Mota concluded that 2013 is the year of open hardware. The community is on the cusp of some major developments – such as breaking through into markets such as consumer electronics – and there are exciting if uncertain times ahead.

Mota’s presentation was one of my favourites so far of LinuxCon – not just because it was from a female presenter who is so well respected within the industry, but also because her presentation style and general demeanour were so respectful and humble. She was one of only a handful of presenters who thanked the Linux Foundation for the opportunity, and her whole attitude of wanting to learn and grow from the open software community was lovely.

Linus Torvalds – Where Linux is going

The room was jam-packed to hear the Linux superstar being interviewed on stage by Dirk Hohndel. His first question was what Linus was most proud of – which he answered with what can be achieved by the community as a large body. In fact, it is the community, and what they do with Linux, that keeps Linus so motivated. Technologies led by the community but founded on Linux – such as the One Laptop per Child project – are an example of what the community can do together. He also noted that he finds some of the smaller open source projects to be the most interesting.

When asked by Dirk which award he was most proud of, he stated that he didn’t really care about awards, and again posited the community as his key motivating factor. His key quote here was definitely worthwhile:

There’s no Nobel Prize for computer science

When asked what concerns him about the future of open source and Linux in general, Torvalds stated that a number of open source projects he’s seen have less direction than they could, and would benefit from taking direction from outside. He was also a little concerned about personality clashes within the community – stating that while disagreement is a catalyst for growth, sometimes it can get quite personal.

He also mentioned that the patent system is fundamentally broken – an unsurprising claim from an open source advocate. Similarly, he talked about how embedded Linux remains a challenge – particularly for things like mobile phones – referencing the divergence between the Android kernel and the mainline Linux kernel (although they are now set to merge again). He explained how supporting embedded systems can be a lot of work for kernel developers.

When asked who the next Linus Torvalds – the next innovator – will be, he sidestepped the question by stating that

sometimes the old ways are the correct ways

particularly with respect to operating systems.

Torvalds also mentioned the lack of diversity in the community when asked about the age of kernel developers steadily increasing – citing the fact that many are now employed by tech firms once they become developers, in contrast to the students and younger developers of several years ago. He broached the gender question himself, stating that 99% of Linux developers are male and that the community needs more female developers, but that he wasn’t sure how to fix this.

I thought this showed a lack of insight into the work of groups like the Ada Initiative (inter alia), who have done research into open culture participation by women. This research has led to concrete, tangible recommendations for steps that can be done to increase participation by women.

Stephen Hemminger – Taking the fear out of contributing

Vyatta’s Hemminger set the scene by stating that it is hard to start contributing to open source generally – and to the kernel in particular – when there isn’t a lot of support: you can easily be scared off.

He covered the different reasons why people contribute to open source, such as wanting to learn, being intellectually curious, needing a challenge and to a certain extent being altruistic. He also mentioned that people wouldn’t do it if it wasn’t – at least in some way – fun.

Hemminger gave an overview of what the kernel contribution process is supposed to look like – and contrasted this with what it *is* actually like for developers – where there is a lot of flaming and trolling in the mix instead. In his experience, he has found encouragement and feedback really important – particularly when navigating a community where fiefdoms and political battles exist.

He covered three main cases of submission failure:

  • Big changes – Massive changes are less likely to be accepted because they are harder to review and assure
  • Arrogance – an unwarranted sense of pride in ability can detract from how the utility of a patch is perceived by the community
  • Divisive changes – Changes that are incompatible, are proprietary, are reinventions or just plain ugly are likely to fail.

So, how can this be addressed? Hemminger drew a number of analogies with his involvement in Toastmasters, where the process for evaluating public speeches is highly formalised. The techniques used by Toastmasters – being prepared, active listening, the feedback ‘sandwich’, and focusing on the presentation rather than the person – would all be worthwhile for improving kernel submission evaluation.

Hemminger also touched on the role of mentoring within the community, and explained how local mentoring is generally better – face-to-face contact can be very productive. Having the role of mentor clearly documented helps – with functions such as being a local advocate and a cheerleader for the person.

This presentation was really about culture change within the Linux community – and while the assertions were well made, I remain skeptical that they will be adopted widely.

LinuxCon Europe 2012 – Day 2

Posted on November 7th, 2012

Mårten Mickos – Open Source Cloud Platforms

Mårten Mickos of Eucalyptus opened day two with a presentation on open source cloud platforms. He began by naming the big four open source cloud platforms – OpenNebula, OpenStack, CloudStack and Eucalyptus – the ‘four beautiful open source sisters’. He drew a further analogy: Starbucks is the public cloud – you know exactly what you can get – while the home espresso machine on your bench is the private cloud. They serve different purposes.

He went on to explain the different types of cloud – public, private, hybrid and mobile – and asserted that everything will go mobile eventually – a prediction I wholeheartedly agree with. As a Eucalyptus rep, he was obviously outlining where Eucalyptus’ services sit, and he positioned them between the public cloud and the data centre – where people might be making their first move from the data centre to the cloud, or in reverse, bringing some services back in house.

Mickos stated that Eucalyptus understood the desire of developers to have their workloads liberated from the underpinning cloud platform – and not to be locked in to any one platform. This ability is scaring some hardware vendors, as increased core and node utilisation from leveraging cloud services means that less hardware is required.

Mickos went on to opine that the four open cloud offerings were a source of better innovation through cross-pollination – their competitive collaboration made each strive to be better. In conclusion, Mickos stated that the open cloud provides developers with freedom of environment, scale and deployment.

Monty Taylor – Growing an open source community

Taylor, of OpenStack fame, demonstrated how they grew their community through a fundamental commitment to openness. This starts from the basic belief that anyone can participate in and contribute to the project, and that there is a sense of liberté, égalité and fraternité throughout. The freedom of the project also needs to be assured by ascribing an appropriate licence, and the governance models and processes of the project need to be egalitarian too. While a benevolent dictator (in the form of an individual or a large corporate sponsor) may seem like a positive thing, it can be off-putting for meritorious contributors. Taylor explained how open, participative design summits – copied from the Ubuntu model – were also used to help cohesion within the community. He further explained that the repositories for a project need to be transparent and open, and that code reviews are necessary to ensure quality – but they too need to be transparent and open in nature.

The key takeaway here was that in order to grow an open source community, everything about it needs to be open – the code, the governance, the processes, the contributions, etc.

Matt Asay – Picking the next black swan in open source

Matt Asay, who works with 10gen (the MongoDB company), presented on how to pick the next ‘winners’ of open source. He opened by sharing some of the biggest ‘black swan’ events of the past 40 years – such as ‘no one will ever need more than 640KB of RAM’ – highlighting that it’s difficult to predict the future.

One key principle that’s often employed is to ‘follow the money’, but Asay demonstrated that this doesn’t always hold. In open source, in contrast with the commercial sector, money and profits are not always harbingers of success. Some open source companies make massive technical breakthroughs without being financial successes.

Instead, he encouraged those interested to observe the user and developer ratios of projects – and quoted a figure that for every 1000 users, there are 10 bug reporters – and one developer. So the ratios are important.
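Asay’s quoted ratio lends itself to a quick back-of-the-envelope check. A minimal sketch, assuming the 1000 : 10 : 1 figure holds (the 50,000-user project below is invented for illustration):

```python
# Asay's rule of thumb: for every 1000 users of a project there are
# roughly 10 bug reporters and 1 developer.
USERS_PER_DEV = 1000
REPORTERS_PER_DEV = 10

def expected_counts(users):
    """Estimate bug reporters and developers from a user count."""
    devs = users / USERS_PER_DEV
    reporters = devs * REPORTERS_PER_DEV
    return reporters, devs

# Hypothetical project with 50,000 users:
reporters, devs = expected_counts(50_000)
print(f"~{reporters:.0f} bug reporters, ~{devs:.0f} developers")
```

A project whose observed ratios deviate sharply from these estimates – far fewer developers per user, say – might be one to watch with caution.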

Other factors to consider when picking open source winners included whether the project was prepared for participation by having assets such as documentation, modularity, accessibility to code and to knowledge of the project, a solid codebase and a license to fit the need. Where the code was hosted was another factor, and Asay highlighted how the market dominance of SourceForge and SVN was being usurped in recent times by GitHub.

He also encouraged us to follow the developers and the data – as big data is getting bigger all the time, with technologies such as Hadoop, NoSQL and analytics having a larger role to play in the future. Big data is big on processing and big on storage, and this forced companies such as Google, Facebook (viz. Cassandra) and Amazon to write their own stuff – big data was the driver. From there it was not a massive leap to show how to pick open source winners by following job market trends. Buzzwords such as HTML5, iOS, Android, Puppet, Hadoop et al. barely existed 2–3 years ago.

He concluded by stating that in open source, we build what matters. We innovate in technology – and open source is focussed on big issues – not – for example’s sake – on finding ways to get people to click more advertisements.

Imad Sousou – Linux at Intel

Sousou opened with an explanation of Moore’s Law, and how it has more or less held true for the last forty years. He then applied the law to the automotive sector: if a VW Beetle had followed Moore’s Law, it would now go at over 300,000 km/h and run for 5 cents a week! The kind of constant innovation that allows technology to evolve at such a rapid pace takes a lot of dedication, commitment – and investment.
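The arithmetic behind the Beetle analogy is easy to verify – a sketch assuming the common ‘doubling every two years’ formulation of Moore’s Law (the talk gave only the end figures, not the baseline it assumed):

```python
# Moore's Law: transistor density (and, loosely, performance) doubles
# roughly every two years. Applied over the forty-year span Sousou cited:
DOUBLING_PERIOD_YEARS = 2

def moores_law_factor(years):
    """Cumulative improvement factor after the given number of years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

factor = moores_law_factor(40)  # forty years of doubling
print(f"improvement factor: {factor:,.0f}x")  # 2**20 = 1,048,576x
```

A million-fold improvement on a car is exactly the kind of absurd-sounding number the analogy is designed to produce.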

Sousou demonstrated how Intel has played a key role in the Linux community and in many open source projects such as Wayland, dLeyna and others, with the overriding theme being the development of apps in web technologies. Here, two challenges are still present – API completeness and performance. The W3C APIs simply don’t cover everything an application needs to do, so Intel has helped create the System Applications working group within the W3C to improve this. They are also working on web technology performance, to help improve things like fluid animation. Their investment in automotive Linux is also to be noted – and one wonders how long it will be before we have Linux not just on the desktop, in the data centre and in our mobile devices, but also under the bonnet.

Ralf Flaxa – Enterprise Linux Evolution

Flaxa, VP of Engineering at SuSE, opened by stating that he didn’t want to give us the standard sales pitch. Instead, he told us the story of how he became involved in Linux – and it was all because he wanted a serial driver. From his first Linux Kongress in 1994, he still feels the sense of collaboration and community, even though many members of the Linux ‘family’ are now his direct competitors.

Much of his talk echoed previously covered themes – such as what it means to be ‘open’. To be truly open means being open in many ways – open source, open licence, open community, open governance, open repositories, open to inviting the competition, and open to contribution.

He gave a number of hints on how to achieve this, such as:

  • grant influence only to contributors
  • welcome contribution in any form – documentation, code, money, testing
  • encourage beginners and lower the barrier to entry
  • remember to give credit and recognition
  • strive for the best possible code base
  • keep things simple by modularising them and breaking them down into digestible bits

Conference Drinks sponsored by Intel at Casa Batlló

#linuxcon drinks thanks to @intel

LinuxCon Europe 2012 – Day 1

Posted on November 6th, 2012

The first day of LinuxCon Europe 2012 went brilliantly. After landing in Barcelona following a short hop from Edinburgh via London City, I had a great night’s rest, if a little jetlagged, and woke up early and raring to go. The only bugbear with the hotel room (which was spacious and well appointed) was that the hotel internet was as slow as a wet week.

Conference registration was very smooth, with my badge printed on the spot based on my emailed registration number. The Rego Desk staff were also very welcoming and friendly – always a good omen for what a conference is going to be like – and easily recognisable from their (black) t-shirts.

Speaking of t-shirts, I nearly died when they had one in my size! Oh yes, I am liking this conference. It’s OK, no t-shirtgate this time around :-) Of course, it will look better when spiced up with some Arduino LilyPad goodness.

#linuxcon tshirt #breaking IT FITS!! Needs some #arduino #lilypad :-)

Next, it was time to plan my schedule for the day, which was made much easier by the use of mobile friendly schedule planning tools. One of the key gaps that I think we had when running linux.conf.au earlier in the year was the lack of mobile support that ZooKeepr, the conference software that we used, had.

The projection and screen displays were top notch, particularly in the keynote venue. Audio and acoustics were great, and speakers could easily be heard. Apparently the event was being live streamed – I didn’t check, but the audio was certainly good enough for live streaming.

The one thing I didn’t see but expected to was more women – Leslie Hawthorn was the only lady I recognised on Day 1, but maybe there are more coming later in the week. After the 10-15% female registration at linux.conf.au, it did seem a bit unusual. The lack of female attendance also drew attention on Twitter – not in an unwelcoming way, just observational.

There were big names amongst the sponsors this year too – with Intel and Qualcomm major sponsors. In a move that I also found surprising at Open Source Systems, Microsoft were again gold sponsors. This seems to fit with their cloud strategy – with products such as Azure becoming a key product in their portfolio.

Introduction – Jim Zemlin

Executive Director of the Linux Foundation, Jim Zemlin officially opened proceedings, remarking on the current trend of collaborative open development, and invited attendees to read the Linux Foundation’s latest IDC whitepaper on Linux and the open cloud (http://www.linuxfoundation.org/publications/linux-foundation). He also welcomed new and upgrading members of the Linux Foundation, and welcomed the Automotive Linux Initiative – lamenting that he would have ordered a Linux BMW already, if only he could get his wife to agree. Zemlin concluded by encouraging the audience to vote for Linux in the ‘biggest social impact’ category of the 2012 TechCrunch Awards – the ‘Crunchies’.

Zemlin also used one of the breaks to recognise the outstanding achievements of long time Linux advocate Mr Masahiro Date, who is soon to retire from Fujitsu.

Advancing the User Experience – Mark Shuttleworth

Mark Shuttleworth, founder of Canonical and a luminary in the Ubuntu community, made the case for ‘connecting the dots’ for the entire user experience across the entire Linux platform – thus making developers more comfortable on Linux – and with a plethora of choice at their fingertips – and making Linux their preferred option.

Shuttleworth explained that in order to do this, the ‘operational friction’ of using Linux needs to be reduced – it needs to be a smooth, seamless experience, and help developers and sysadmins to manage the increasing complexity they are faced with. Indeed, he described a key failure of Linux – and a possible reason for its delayed dominance – as its failure to produce clarity of user experience.

Reducing operational friction can be achieved by putting effort into user experience and by using design thinking – the philosophy behind Canonical’s Juju offering. Based on the concept of ‘charms’, Juju attempts to distill applications – and knowledge about them – into the cloud, encapsulating and reusing heterogeneous application components.

Shuttleworth went on to describe the convergence of all devices to one platform. Clients will be doing less processing, which means that the heavy processing will be done in the data centre. The cost of VMs is decreasing, and the cost of desktop PC management is increasing. We are facing a thin client/thick cloud world. His vision is that there will be a “common version of Linux on every device across the institution”.

Mostly Sunny – Dave Engberg

This presentation by Engberg, Evernote‘s Chief Technology Officer, was one of my personal highlights. He went through a detailed business case and cost comparison explaining why Evernote has chosen to host its own 400–500 boxen, in a seemingly contrarian manner to current best practice. As he explained, it all comes down to

“What is the cloud good at, and is this what you need?”

Engberg worked through several slides comparing the performance characteristics of their in-house setup with what it would cost to replicate that capability using cloud services. The application characteristics for Evernote – large storage, and heavy use of metadata and Lucene indexing – are their highest costs. They rarely see high CPU spikes, so purchasing cloud services doesn’t fit their needs.

Their total cost in-house was around $90.5k per month, compared with going to cloud services at a cost of $182.5k–$284.3k per month. Of course, as Engberg noted, your organisation has to factor labour costs and risk into the equation – and make the decision that is right for it. Essentially, his point was that you should be using the cloud unless you can justify not doing so – and in Evernote’s case, he had.
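Using only the figures quoted above, Engberg’s comparison reduces to simple arithmetic – a sketch that deliberately excludes the labour and risk costs he mentioned:

```python
# Evernote's quoted monthly costs (USD), before labour and risk.
IN_HOUSE = 90_500
CLOUD_LOW, CLOUD_HIGH = 182_500, 284_300

def cloud_premium(in_house, cloud):
    """How many times more the cloud option costs than in-house."""
    return cloud / in_house

low = cloud_premium(IN_HOUSE, CLOUD_LOW)
high = cloud_premium(IN_HOUSE, CLOUD_HIGH)
print(f"cloud costs {low:.1f}x to {high:.1f}x the in-house setup")
```

Even at the low end of the cloud quote, Evernote would have been paying roughly double – which is what made the in-house case so defensible for their workload.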

To me this presentation was an excellent use of metrics, and a great example of why it makes good business sense to have a solid understanding of total cost of ownership of services – so that you can make informed strategic decisions about procurement and sourcing.

Xen in the cloud – Lars Kurth

Kurth introduced the topic by providing a brief history of the Xen Project and explaining the two key hypervisor architectures – the first being the bare-metal hypervisor, such as VMware ESX, with higher security; the second being hosted, running within a host environment. He explained that the Xen hypervisor uses a Dom0 kernel – essentially the hypervisor boots into the Dom0 kernel, and thus there is an advantage in re-using the Linux kernel. However, Xen is not in the kernel itself – but everything you need to run Xen is, and Xen packages are in most distros.

He went on to explain the concept of disaggregation and the driver/stub/service domain model where applications are deprivileged and isolated. He also explained the concept of paravirtualisation, and the virtualisation spectrum of options from fully virtualised to paravirtualised.

I fell asleep from jetlag at this point and had a siesta :-)

Metrics for open source projects – Dawn Foster

“The right metrics for my project are not necessarily the right metrics for your project”

Foster opened her talk by explaining that metrics are useful for a wide range of reasons – who contributes, where, what they’re interested in, to help recognise contributors and so on. The open source community uses a whole range of different tools, and these can be measured in different ways. However, as she pointed out, it’s important that you know from the start what the objectives of your project are. Do you wish to grow your number of contributors? Do you want to resolve a number of bugs, or squash a number of outstanding ones? You need to know what you want to measure and why – how will it drive action in your open source project?

Some of the tools she covered included

  • mlstats – a command line tool for mail which can analyse mailing list data, content and top contributors
  • Google Groups – which sadly has no API, so data had to be manually scraped
  • IRC stats – where a range of tools such as irssistats, pisg and superseriousstats were used, depending on log file format
  • gitdm – for measuring git contributions
  • trac – for bugs
  • graphite – for visualisation
  • gather – for collecting data
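For git repositories, the per-contributor tally that gitdm produces can be approximated in a few lines. A minimal sketch that parses the output of `git log --format='%ae'` – the sample log below is invented for illustration, and gitdm itself does considerably more, such as mapping emails to employers:

```python
from collections import Counter

def count_contributions(log_output):
    """Tally commits per author email, gitdm-style.

    Expects the output of: git log --format='%ae'
    (one author email per line).
    """
    emails = [line.strip() for line in log_output.splitlines() if line.strip()]
    return Counter(emails)

# Invented sample output for illustration.
sample = """\
alice@example.org
bob@example.org
alice@example.org
carol@example.org
alice@example.org
"""
for email, commits in count_contributions(sample).most_common():
    print(f"{commits:4d}  {email}")
```

As Foster stressed, the raw counts only become useful once you know what question they are answering – growth in contributors, concentration of work in a few hands, and so on.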

The path to open source virtualisation – Adam Jollans

Jollans’ talk opened with the top 3 external factors that CEOs believe impact their business – they were

  • Big data – massive amounts of data being processed
  • Dispersed mobile workforce
  • Inefficient use of capacity, with around 85% server idle time

The key technologies he saw for innovation were

  • Analytics – where tools such as Hadoop have a role to play
  • Mobility
  • Virtualisation – where tools such as KVM, Xen etc have a major role. Virtualisation is the foundation of the cloud.
  • The Cloud – where tools such as OpenStack etc are playing major parts
  • Security

Essentially, Jollans was arguing that the major problems CEOs face today are solved with Linux.

In regard to virtualisation specifically, he demonstrated that many companies now run more than one hypervisor in their environment, the key reasons being technical differences between solutions, and the cost of having multiple installations of expensive hypervisors such as VMware ESX.

He explained that KVM (Kernel-based Virtual Machine) plugs into Linux (effectively making it a type 1, bare-metal hypervisor), and that QEMU provides the I/O virtualisation. Historically, virtualisation has only been worthwhile on very big machines, where there are real benefits to reap – and he explained that IBM has been investing heavily in virtualisation for many years, with scalability and security being the key focus points.
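Since KVM relies on hardware virtualisation extensions, a quick way to see whether a host can run it is to look for the relevant CPU flags. A hedged sketch that checks /proc/cpuinfo-style text for the Intel (vmx) or AMD (svm) flags – parsing a sample string here, since the real file is host-specific:

```python
def kvm_capable(cpuinfo_text):
    """Return True if the cpuinfo flags include hardware virtualisation
    support (vmx for Intel VT-x, svm for AMD-V), which KVM requires."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            return bool(flags & {"vmx", "svm"})
    return False

# On a real host you would pass open("/proc/cpuinfo").read() instead;
# this sample line is invented for illustration.
sample = "flags\t\t: fpu vme de pse tsc msr pae vmx sse sse2"
print(kvm_capable(sample))
```

This is essentially what tools like kvm-ok do before letting you load the kvm_intel or kvm_amd kernel modules.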

Jollans went on to explain that IBM helped found the Open Virtualization Alliance to promote open source virtualisation.

He also presented use cases for KVM, including:

  • Linux server consolidation – not yet as widespread as Windows server consolidation, and KVM is an easy way to achieve it
  • Hypervisor diversity – many businesses are choosing to have hypervisor diversity to suit different requirements
  • Virtual desktops – VDI is on the increase and needs high levels of memory scalability.

Linux at the forefront – Brian Stevens

Stevens, Chief Technology Officer and Vice-President of Worldwide Engineering at Red Hat, opened by advocating that Red Hat had achieved ’20 years of disruption’ – with the customer value of Linux driving wider adoption. He outlined five principles on which the Red Hat business model was founded:

  1. Invest in the advancement of Linux
  2. Enablement – hardware capability
  3. Facilitate fast upstream development
  4. Ecosystem – it’s not about Linux, it’s about the ecosystem
  5. Boring – enable hardware upgrades without churning the application stack

With this model, Red Hat eclipsed $1 billion in revenue last year, but Stevens insists that open source is not a business model; rather it is

“the best development model on the planet – modular innovation which can be consumed incrementally”

Stevens went on to show that Linux is at the heart of most of the hottest technologies of the moment – and that if you build your application on Linux, it will run anywhere.

He highlighted Red Hat’s OpenShift offering and how it addresses some of today’s challenges – such as constantly growing data, and data which is less structured than it used to be. To underscore this, his presentation was built in reveal.js and hosted on OpenShift.

Kernel report – Jonathan Corbet

Corbet highlighted that over 400 employers contribute to the Linux kernel, but that over time voluntary contributions have been dropping. Reasons for this include the fact that skilled volunteer contributors are readily hired, but that we might as a community need to encourage newer contributors to join the fold.

Mobile and embedded participation is also growing rapidly, mirroring developments in the wider technology sphere. Linux is also leading networking developments, and the ARM architecture is starting to dominate. Security efforts, once neglected, are now being revamped.

UEFI secure boot is still a challenge. Solutions do now exist for Linux, but only for as long as Microsoft continues to be co-operative. This is a cause for concern – Microsoft could easily withdraw its support – and the issue should be watched.

On the filesystem front, ext4 is still the key workhorse, but btrfs is starting to mature and stabilise.

As for gaps, regression tracking is now a key one – and something Corbet encouraged the community to become involved in.

 

© Klog: Kathy Reid’s Blog • Powered by Wordpress • Using the Swiss Cool theme.