Categories
main mozilla po polsku tech

An interview about Fx4 and more

It’s hard to believe I haven’t written here for two months… I buried myself so deeply in the L20n project that I couldn’t find the motivation to blog. :/ I promised myself to change that this year, because I believe that communicating what you’re working on helps the whole community – so expect more posts, and shout at me when I start lagging 😉

To kick off the year, here’s an interview that appeared on firefox.bajo.pl – enjoy the read, and I’m looking forward to your opinions 🙂

Categories
main mozilla tech

Mozilla Summit 2010 – Localization 2.0 talk

Here are the slides I used for my Summit 2010 Localization 2.0 talk.

It was a very tough talk to give. Hard to grasp, hard to explain. I originally wanted to devote it exclusively to L20n and make it a form of tech talk, but eventually figured out that it would not work and that there was a much broader vision I needed to explain. So a few hours before the talk I started rewriting it and ended up with what you can find here.

Of course slides alone will tell you just a small part of the story, but it’s better than nothing. 😉

Thanks to everyone who participated! I know it takes a lot of brain cycles to process, and it was already 4pm, but I hope you enjoyed it! 🙂

Categories
main mozilla tech

Pontoon – more details and 0.2 plans

Following up on the blog post that introduced Pontoon, I’d like to present the project in a little more detail, along with the vision for the near future.

Clients

As stated previously, Pontoon 0.1 introduces two types of clients – a web client and a Jetpack client. Both share the majority of the code base (Pontoon, jQuery, jQueryDomec and a heavily modified editableText plugin), but because of the limitations of the Jetpack prototype I had to merge them into one JS file.

User experience

For the user interface of Pontoon 0.2, I’d like to focus on a few key aspects that may make Pontoon a tool ready for use in a real work environment:

  • Introduce an identification system that will sign contributions made by a given user. We may use Verbatim accounts for now.
  • Figure out a way to recognize the current state of a localization and allow picking it up from where it was left off.
  • Add the ability to view/edit an entity in a source view (much like WordPress’s HTML view).
  • Fix some common cases, e.g. an entity consisting of a single anchor should treat the anchor itself as the entity, not its parent.
  • Improve the UI to look nicer and inform the user about the outcome of their actions.

With those improvements (the top two being the major ones), Pontoon will, in my opinion, become ready for real-life use, and we may offer it on one of the upcoming small websites to those brave souls who would like to test it.

Code base

On the code base front, I definitely want to clean up the sources, which will be easier once the Jetpack client gets updated to the new Jetpack architecture. The editableText plugin will require a cleanup and will be extended to allow a source view, so it may make sense to fork it as a separate project, while other classes should be refactored and site-dependent code should be moved out of the libraries. Also, CSS inclusions should be merged into the libraries.

One thing I’d like to see soon is a third type of client – one that can be embedded into any website so that the website author can turn the “localization” mode on and off.

Server

The Pontoon server will require adaptation to the client-side changes, like the ability to identify/authenticate users, and then two new things should become possible with the data received from a client. First, the server should be able to add translations as suggestions in Verbatim. This way users will be able to suggest translations, leaving it up to the registered localizers to decide what to do with them. This is the intended workflow for Pontoon.

Second, we should be able to generate a pure, localized HTML file out of a source one and a translation list. This way we will be able to localize things like SUMO documents or plain HTML sites that do not use gettext or even PHP.
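To illustrate the idea, here is a minimal Python sketch (not Pontoon’s actual code) of that generation step. It assumes the translations arrive as a simple source-to-translation map and just substitutes strings in the markup:

    def localize_html(source_html, translations):
        """Return a copy of source_html with known source strings replaced.

        A naive sketch: real markup handling (attributes, escaping, entity
        boundaries) would need an HTML parser, but the principle is the same.
        """
        localized = source_html
        # Replace longer strings first so a short string does not clobber
        # a longer sentence that contains it.
        for source in sorted(translations, key=len, reverse=True):
            localized = localized.replace(source, translations[source])
        return localized

    page = "<h1>Welcome to Firefox</h1><p>Download now</p>"
    print(localize_html(page, {
        "Welcome to Firefox": "Witaj w Firefoksie",
        "Download now": "Pobierz teraz",
    }))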

Hooks

The hooks will go through a major rewrite, because I want to test the concept of providing an external meta file with information such as lists of entities for the client to read. Such meta information is also the most natural way to add the ability to pick up a localization from a midpoint. The issue, of course, is that this will not help the client localize websites that do not use Pontoon hooks.

It would also be great to add Django hooks, since so many of our websites these days are migrating to Django, and it would let us test how well the hook system carries over between programming languages.
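Such a Django hook does not exist yet, but as a hypothetical sketch of its shape, it could be a template tag that wraps gettext and marks each translated string with its entity id, so the client can find entities in the rendered page instead of guessing. The tag and attribute names below are made up for illustration:

    from django import template
    from django.utils.safestring import mark_safe
    from django.utils.translation import gettext

    register = template.Library()

    @register.simple_tag
    def l10n(msgid):
        """Translate msgid and wrap it in an element carrying its entity id."""
        return mark_safe(
            '<span data-l10n-id="%s">%s</span>' % (msgid, gettext(msgid))
        )

A template would then use something like {% l10n "Download now" %}, and the client could locate entities by the data-l10n-id attribute.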

Summary

This is my plan for the short term, and I’ll try to get as many of the features described above into Pontoon 0.2 as possible. I prepared a wiki article describing the plan, and I’m open to feedback, ideas and, of course, contributions. 🙂

Categories
main mozilla tech

Pontoon – introduction

One of the three core types of content I described in my previous blog post is what I call rich content.

This type of localizable content has characteristics that are very different from common UI entities: a different goal (to provide information), a different size (long sentences, paragraphs), different l10n flexibility (the need to reorder, extend or shrink long texts) and a much richer syntax (sentences can be styled with HTML and CSS).

Almost a year ago I started playing around with the idea of a web tool that would allow WYSIWYG-style localization for the web. It was later picked up by Ozten in his blog post about potential use cases, and finally landed in Fred Wenzel’s mind as he decided to give the idea a try. He ignited the project and created the first iteration of the UI, at which point I joined him and added the server side.

Now, the idea itself is neither new nor shocking. Many projects have experimented with some form of live localization for web content, but the solutions we know so far are hardly adaptable for generic purposes. Some require a lot of prerequisites, others are gettext-only, and they come as black-box solutions that require you to follow all of their procedures and lock your process in.

Pontoon is different in the sense that it is a toolset that allows for rich content localization with a graded set of prerequisites and varying strictness of output. It’s a very alpha-stage tool, so be kind, but I think it’s ready for the first round of feedback 🙂

Pontoon – components

Pontoon is composed of three elements:

  • client – the Pontoon client is an application that allows for content navigation and localization. It’s responsible for analyzing the content and providing ways for the user to translate it.
  • server – the Pontoon server is an application that the client communicates with in order to send/update translations provided by the localizer, storing them for the website to use later.
  • hook – the Pontoon hook is a small library that can be plugged into a target application in order to provide additional information that improves the client’s abilities.

Various clients are possible; for now we have two – an HTML website client and a Jetpack 0.8.1-based one. They share a lot of code and operate quite similarly.

We have one server – a Django-based one – that receives translations provided by the client and uses Silme to store them in a file (currently .po), and we have one hook – for PHP – that adds special meta headers and provides a simple API of modified gettext calls, giving the client direct information on where the entities are located so that it does not have to guess.
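As a rough illustration of the server-side step: the real server uses Silme, whose API I won’t reproduce from memory here, so this sketch uses the well-known polib library instead, and the function itself is made up:

    import polib

    def store_translations(po_path, translations):
        """Merge client-submitted {msgid: msgstr} pairs into a .po file."""
        po = polib.pofile(po_path)
        existing = {entry.msgid: entry for entry in po}
        for msgid, msgstr in translations.items():
            if msgid in existing:
                existing[msgid].msgstr = msgstr
            else:
                po.append(polib.POEntry(msgid=msgid, msgstr=msgstr))
        po.save(po_path)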

I’ll dig into the details in a separate post; there’s much more to how Pontoon can operate and what enhancements are possible for each of the three components. For now, I’d like to show you a short video of Pontoon being used to localize two old versions of our website projects.

The video is one coffee long, so feel free to grab a cup and set aside 6 minutes 🙂

URLs used in the video:

Notes:

  • the locale codes supported by the server are usually of the form ab_CD, not ab – for example de_DE, fr_FR, etc.
  • Pontoon does not support multiple users at the same time, so you may observe strange results when many people try it simultaneously. Enjoy alpha mode!
  • “0.1 alpha” means – I’m looking for feedback, comments, ideas and contributions 🙂

Categories
main mozilla tech

My vision of the future of Mozilla localization environment (part1)

After two parts of my vision of local communities, I’d like to make a sudden shift and write a bit about the technical aspects of localization. The reason is simple: the third, and last, part of the social story is the most complex one and requires a lot of thinking to get right.

In the meantime, I’m working on several aspects of our l10n environment, and I’d like to share some of the experiences and hopes around it.

Changes, changes

What I wrote in part 1 of the social vision about how the Mozilla landscape is changing and getting more complex holds true from the localization perspective as well, and requires us to adapt in much the same way it requires local communities to.

There are three major shifts I observe that make our past approach insufficient:

  1. User interfaces are becoming more sophisticated than ever
  2. Product forms are becoming more diversified, and new kinds of mashups are appearing that blend web data, UI and content
  3. Different products have different “refresh cycles” in which different amounts of content/UI are replaced

Historically, we used DTD and properties files for most of our products. The biggest issue with DTD/properties is that those two formats were never meant to be used for localization. We adapted, exploited and extended them to match some of our needs, but their limitations are pretty obvious.

In response to those changes, we spent a significant amount of time analyzing and rethinking l10n formats to address the needs of Mozilla today, and we came up with three distinct forms of data that require localization and three technologies that we want to use.

L20n

Our major products, like Firefox, Thunderbird, SeaMonkey or Firefox Mobile, are becoming more sophisticated. We want to show as little UI as possible; each pixel is sacred. If we decide to take a piece of the screen away from the user, we want to use it to the maximum. Small buttons and toolbars should be denser – they should present and offer more power, be intuitive and allow the user to keep full control over the situation.

That poses a major challenge for localization. Each message must be precise, clear and natural to minimize the user’s confusion. Strings are becoming more complex, with more data elements influencing them. It’s becoming less common to have plain, static sentences. It’s becoming more common for a string to show up in a tooltip, to have little screen space (Mobile) and to depend on the state of other elements (the number of open tabs, the time, the gender of the user).

DTD/properties are absolutely not ready to meet those requirements, and the more hacks we implement, the harder it will be to maintain the product and its localizations. Unfortunately, other technologies we considered, like gettext, XLIFF or Qt’s TS file format, share most of those limitations and have themselves been hacked around for years now (like gettext’s msgctxt).

Knowing that, we started thinking about what a localization format/technology would look like if we could start it today. From scratch. Knowing what we know. With the experience that we have.

We knew we would like to solve, once and for all, the problem of the astonishing diversity of languages, linguistic rules, forms and variables. We knew we’d like to build a powerful tool set that would allow localizers to maintain their localizations more easily and localize with more context information (like where a string will be used) than ever. We knew we wanted to simplify the cooperation between developers and localizers. And we knew we would love to make it easy to use for everyone.

Axel Hecht came up with the concept of L20n: a format that shifts several paradigms of software localization by enabling algorithmic power outside of the source code. His motto is “make easy things easy, and complex things possible”, and that’s exactly what L20n does.

It doesn’t make sense to try to summarize L20n here – I’ll dig deeper in a separate blog post in this series – but what’s important for the sake of this one is that L20n is meant to be a new beginning, different from previous generations of localization formats, redefining the contract between localizer and developer known as “an entity”.

It targets software UI elements, should work in any environment (yes, Python, PHP and Perl too) and should allow for building natural sentences with the full power of each language without leaking that complexity to other locales or to the developers themselves. I know, it sounds bold, but we’re talking about Pike’s idea, right?
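To make the “algorithmic power outside of the source code” idea a bit more concrete, here is a sketch in Python rather than in L20n syntax: the locale’s data, not the application, decides how a message varies with runtime state. The Polish plural rule and strings below are just an illustration:

    def polish_plural(n):
        """Select the Polish plural category for n: one, few or many."""
        if n == 1:
            return "one"
        if n % 10 in (2, 3, 4) and n % 100 not in (12, 13, 14):
            return "few"
        return "many"

    # What a localizer would own: the variants of a single entity.
    TABS_OPEN_PL = {
        "one": "Otwarta jest 1 karta",
        "few": "Otwarte są {n} karty",
        "many": "Otwartych jest {n} kart",
    }

    def tabs_open_message(n):
        # The developer only passes the state; the locale picks the form.
        return TABS_OPEN_PL[polish_plural(n)].format(n=n)

    print(tabs_open_message(5))  # Otwartych jest 5 kart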

Common Pool

While our major products require more complexity, more and more new products are appearing in Mozilla, and very often they require little UI because they are meant to be non-intrusive. Their localization entities are plain, simple and short, and usually have a single definition and translation. The land of extensions is the most prominent example of this approach, but more and more of our products have such needs.

Think of an “OK” and a “Cancel” button. In 98% of cases, their translations are the same no matter where they are used. In 98% of cases, their translations are the same across all products and platforms. On top of that, there are three kinds of exceptions.

First, sometimes the platform uses a different translation of the word. For example, Mac OS may have a different translation of “Cancel” than Windows. It’s an easy, systematic difference shared among all products. It does not make any sense to expose this complexity in each localization case and require each one to be prepared separately for this exception.

Second, sometimes an application is specialized enough to use a very specific translation of a given word. Maybe it is a medical application, a low-level development tool, or something for lawyers only? In that case, once again, the difference is easy to catch and there’s a very clear layer at which we should make the switch. Exposing it lower in the stack, for each use of the entity, does not make sense.

Third, it is possible that one single use of an entity may require a different translation in a given language. That’s an extremely rare case, but a legitimate one. Once again, it doesn’t make sense to leak this complexity onto others.

The Common Pool addresses exactly this type of localization: simple, repetitive entities that are shared among many products. In order to handle the exceptions, we’re adding a system of overlays which allows a localizer to specify a separate translation on one of the three levels above (possibly more).
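Here is a sketch of how such an overlay lookup could resolve, with the most specific layer winning and the shared pool as the fallback. The layer names and data are my illustration, not a finalized Common Pool design:

    COMMON_POOL = {"cancel": "Anuluj"}
    PLATFORM_OVERLAYS = {"macosx": {"cancel": "Poniechaj"}}    # platform-wide
    PRODUCT_OVERLAYS = {"medical-app": {}}                     # product-wide
    ENTITY_OVERLAYS = {("medical-app", "confirm-dialog"): {}}  # single use

    def resolve(key, product, platform, entity_ref):
        """Return the most specific translation available for key."""
        layers = (
            ENTITY_OVERLAYS.get((product, entity_ref), {}),
            PRODUCT_OVERLAYS.get(product, {}),
            PLATFORM_OVERLAYS.get(platform, {}),
        )
        for layer in layers:
            if key in layer:
                return layer[key]
        return COMMON_POOL[key]

    print(resolve("cancel", "medical-app", "macosx", "confirm-dialog"))
    # -> "Poniechaj" (platform override); otherwise "Anuluj" from the pool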

L20n and the Common Pool complement each other, and we’d like to make sure that they can be used together depending on the potential complexity of the entity.

Rich Content Localization

The third type is very different from the two above. Mozilla today produces a lot of content that goes way beyond product UI, and localization formats are terrible at dealing with such rich content – sentences, paragraphs, pages of text mixed with headers and footers that fill all of our websites.

This content is also diverse: SUMO or MDC articles may be translated into a significantly different layout, and their source versions are often updated with minor changes that should not invalidate the whole translation. On the other hand, small event-oriented websites like Five Years of Firefox or Browser Choice have different update patterns than project pages like Test Pilot or Drumbeat.

In this case, trying to build the social contract between developers and localizers by wrapping pieces of text into uniquely identifiable objects called entities, and using some way to sign them and match translations to sources as we do with product UI, doesn’t make sense. Localizers need great flexibility: some changes should be propagated to localizations automatically, and only some should invalidate them.

For this last case, we need very different tools, specific to document/web content localization, and if you have ever tried Verbatim or direct source HTML localization, you probably noticed how far that is from an optimal solution.

Is that all?

No, I don’t think so. But those are the three types we identified, and we believe we have ideas on how to address them using modern technologies. If you see flaws in this logic, make sure to share your thoughts.

Why am I writing about this?

Well, I’m lucky enough to be part of the l10n-drivers team at Mozilla, and I happen to be involved, in different ways, in experiments and projects that are going to address each of those three concepts. It’s exciting to be in a position that allows me to work on this, but I know that we, the l10n-drivers, will not be able to make it on our own.

We will need help from the whole Mozilla project. We will need support from the people who produce content, from those who create interfaces and, of course, from those who localize – from all of you.

This will be a long process, but it gives us a chance to bring localization to the next level and, for the first time ever, make computer user interfaces look natural.

In each of the following blog posts I’ll focus on one of the above types of localization and present the projects that aim at this goal.

Categories
main mozilla tech

My vision of the future of Mozilla local communities (part 2)

In my previous blog post, I summarized the transition the Mozilla project went through and how it applies to local Mozilla communities. I explicitly mentioned the enormous growth of the Mozilla ecosystem, the diversification of products & projects, and the differentiation of project development patterns, which results in different requirements for marketing, QA, support, localization, etc.

Next, I’d like to expand on how I believe our local communities operate today.

On Local Community Workload

(image from: http://www.edc.ncl.ac.uk/highlight/rhnovember2006g01.php/)

The result of this growth of the Mozilla ecosystem is a rise in the workload that our local communities experience. With this work comes the challenge of communicating the richness of Mozilla on local ground. Localizer workload is mounting, and local communities are trying to find ways to adapt, because:

First, it is not scalable to manage all Mozilla localizations with a team of the size that fit the needs of 5 years ago.

Second, localizers are not the only kind of people in a local community. There are various tasks that require different skills, and different people may find different sorts of motivation to work on different aspects of Mozilla.

It’s pretty easy to get out of balance and try to take on more than you can handle when there’s so much going on and you feel in charge of your locale. Some communities are more successful at finding their way; some are struggling.

I believe we have to adjust our approach to this new reality.

My opinion on the role of l10n-drivers

Traditionally, a lot of the local engagement work has fallen on the plate of localizers. The l10n-drivers team then becomes very important in helping local communities manage their workload. Having participated as an l10n-driver for over a year now, I see how the team has become crucial in supporting communities in several ways. It:

  • Makes sure that when we call out for localizations, what you localize will be used for a long time, to maximize the work/value balance.
  • Builds tools that reduce the entry barrier and the time localizers spend on localization and the local management tasks around it.
  • Provides information on projects, their roadmaps, goals and results (metrics) to help localizers make informed decisions on what to localize and when.
  • Supports localizers in solving localization blockers like hardcoded or untranslatable strings, so that the results of their work are worth the time they spent and, if they want, they can fully localize the product and make it look awesome and natural in their language. (Read: one untranslatable string ruins hard work and is a great way to demotivate anyone.)
  • Helps adjust project roadmaps to minimize the overlap between releases and spread the workload over time.

But the role of local communities has expanded far beyond just localization. Our team’s work will not be enough, and I think we have to revise the assumptions we all make about what the localization process is and what our goals are.

My opinion of the changing role of Localizers

Localization of Mozilla today is not a single, homogeneous task like it used to be. There are different tasks to take on and different people who want to contribute. Some tasks require short spikes of attention once a year (around a release), others require bi-weekly contributions, and others have no release schedule and accept contributions at any time. They all require different amounts of energy, focus, attention and time.

And the core goal of localization – to bring the product closer to your local ground – suddenly involves a complex toolset. With so many projects to choose from, local communities should stop thinking about them as a single bundle. Instead, we should all start recognizing that this diversification allows us to pick what we need.
You, local community members, are best positioned to make the right decisions about which projects are needed in your region. We cannot assume that each region needs the same amount of Mozilla ingredients.

By that I mean not only the ability to pick which projects to localize for your region, but also deciding, together with the project leaders, how much of a project should be translated and what kinds of adjustments are required for your culture. It’s extremely important to understand that sometimes you cannot localize everything, although we all know how much satisfaction comes from “collecting it all“. Sometimes localizing the “top 10” articles gives a better result than trying to figure out how to translate it all. And sometimes you need to go beyond translation. The “top 10” SUMO articles in English may be different from the “top 10” in your locale, and maybe some aspects of a marketing campaign would resonate better in your country if you adjusted them to your culture and reality.

Armed with this power, local communities can pick the projects that best resonate with what is needed to promote the Mozilla vision in their region and put more effort into those. It’s a great power and a great responsibility, and we have to trust that local communities know better than any centralized decision-making system ever could what is important. And we, the owners and peers of the projects, have to help local communities make the right choices and fine-tune the ingredients they picked. You, the local communities, are in charge here.

Local community

With so many tasks represented in Mozilla – evangelism, marketing, PR, QA, development, support, localization – it may be very challenging for a localization team to fulfill them all. Many local communities work on various aspects of the Mozilla project, and what’s common to them is their regional identity and proximity, which allows them to support one another, share resources and find new contributors. I believe it’s crucial to preserve the local identity, and that there is great value for every contributor around the world in peering with other contributors working on other aspects of Mozilla in their region, but localization is not the only task out there.

And more than ever, we need local communities to cooperate with Mozilla project leaders to find new contributors and grow the communities. Generating new projects that attract new contributors is one of the key aspects of a healthy, sustainable ecosystem, and that’s true both for the Mozilla project as a whole and for Mozilla local communities.

In the last part, I’ll try to summarize the state change and give you some ideas to consider.

Categories
main mozilla tech

My vision of the future of Mozilla local communities (part 1)

I know, bold title.

Since I decided to start a blogging week, I see no reason not to start with a major topic I have been working on for a few months now.
The future of local communities in Mozilla is made of two parts – Social and Technical.

I’ll start with the former, and it’s going to be a long one – you know me.

Notice: This is the way *I* see things.  It is not representative of the l10n-drivers, the SUMO team, the QA team, or the marketing team.

But it does represent the progress we’re making right now in thinking about local communities. It is different from what you saw some time ago, it may change in the future, it does not represent any kind of consensus, and my peers may disagree with me on some of my points.

A little bit of history

Historically, and by that I mean the years 2000-2004, when the first strong local communities were constituted, everything was centered around localization. The localization ecosystem had several characteristics:

  • a finite number of projects
  • the core of any local community was its localizers
  • each product had a limited number of strings
  • each product had a release cycle no shorter than 1 year
  • limited awareness among Mozillians of the importance of localization

Another specific thing about that time was that Mozilla as a community/project started growing faster than Mozilla as an organization. By this I mean that people started participating in Mozilla all over the world, sometimes faster than the organization could predict, know about, understand and harness. It was very independent. What happened in Poland was very different from what happened in Italy, or the U.S., or anywhere else. In the days when Mozilla was formally organizing itself, few people at the “central project” could predict what was happening across the world. At times it was very frustrating for them… things were happening so fast, beyond the organization’s control.

The Internet allowed us all, as a motivated community, to download the early Mozilla products and gave us something to gather around. We did, and it was amazing. People started fan sites, discussion forums and “news-zines”. The most determined ones started seeking ways to bring Mozilla to their country. The most natural way to participate was to localize the product, and by localize I mean the various actions that make the product better suited for the local market – translating, changing defaults, and adding new features or modifying existing ones.

All this work was usually targeted in two directions – toward local markets, where those early community leaders were building local branches of Mozilla, and toward the Mozilla project itself, to fit the concept of local communities and the fundamental goal of internationalization into the core of our project culture.

Thanks to the work done in those days, today we can say that Mozilla is a global project, and we recognize localizability as one of the defining aspects of the Mozilla approach to projects.

But since those days, many things have changed. What was good enough back then may not be enough today.

Growth and Variety

Fast-forward to today: Mozilla as a meta-project is producing a much richer set of projects/products/technologies than we ever did.

We create many websites of various sizes. We have blends of websites and extensions (like Test Pilot). We have webtools like Bugzilla and addons.mozilla.org. We have products like Firefox, Thunderbird and SeaMonkey. We have a mobile product with tighter screen space limits. We have experiments, like Ubiquity or Raindrop, that introduce a new level of complexity for localization. We have more content than ever.

The point is this: local communities represent Mozilla through a diverse set of mature products, early prototypes, innovative experiments, one-time marketing initiatives, and documents like our Manifesto that will live forever. This means that the workflow has changed dramatically since the early days. Different projects with different or changing release frequencies are becoming the standard for communities to absorb, in a newly differentiated and highly competitive marketplace. And our communities need to evolve to respond to this.

Each product has different characteristics, and local delivery through l10n and marketing means a very different type of commitment. It now requires different amounts of time and energy, different types of motivation, and different resources.

Additionally, we’re also more diversified in the quest to fulfill our mission. We have regions where modern web browsers constitute the vast majority of the market share, where governments, users and media understand the importance of browser choice or privacy, and where the Internet is a place where innovation happens. But we also have places where that is not the case – where incumbent browsers still hold the majority, where the web does not move forward the way it did in the past, where the latest technologies cannot be used, and where privacy and openness sound artificial.

Recognizing this shift is an important factor in allowing us to adjust to the new reality, where local communities have to expand beyond just localization. They must become local Mozilla representatives who are experienced in evangelism, marketing, localization, software development, and all other aspects of Mozilla. We need to get more local and grow beyond the responsibilities our local communities had in the past.

In the next part, I’ll cover some ideas for the future…

Categories
main mozilla tech

In MtV – blogging week

I delayed it way too long and now feel that I need to catch up on a lot of stuff.

So, since I just got to MtV, where I’ll be spending some time now, I decided to organize a personal blogging week in which each day I’ll blog about a piece of what I’m working on, to hopefully catch up on the projects I failed to blog about lately 🙂

On the plate we have: Jetpack stuff, various dimensions of L20n, Pontoon, the survey project and, for the weekend, some non-Mozilla projects as well 🙂

If you’re in the Bay Area and want to share a drink or coffee, or socialize in any other way, let me know. And if you’re at 300 Castro, I’ve claimed ownership of a desk next to Seth and Asa. It’s a bit busy here, but I like networking :]

Categories
main mozilla tech

We have an important story to tell!

Hey @flod (and Giacomo)! You touched on interesting topics in your latest post, and when I started crafting my comment it got so lengthy that I decided to use my own platform to deliver it. Blog-to-blog discussion style! 🙂

I’ll try to respond, but please bear in mind that this is my personal opinion, nothing official.

You started by pointing out a set of efforts that you find either of questionable value or not “leader” style – things like the Fx UI direction, multiprocessing, Jetpack or Personas.

In particular, you focused on two dimensions:

  1. Are those efforts unique? Innovative? Or are we chasing others?
  2. Are those efforts valuable elements that fit into the Mozilla Manifesto vision?

You question both, and I believe you have every right to do so. We may disagree, but we should talk about this, and I consider the fact that you express your concerns in public a good sign of the health of our ecosystem.

So, down to some of the points you raised. I humbly disagree with your notion of “cloning Chrome”. I believe it’s a cognitive bias that we (and by we I mean most Mozillians I know) buy into so easily – this concept of “fresh Chrome”. Chrome is great! But it is not “innovative” in the sense many people talk about it. We just take for granted, so easily, a lot of the inventions we brought to the world, and Chrome – yes, they just looked at Firefox and learned from us. That is awesome, but that’s what they did!

Once again, Google is looking at an open source project and learning from it how to build a web browser. No, wait! Google, Microsoft and Apple are all doing it. Now, how awesome is that? Think of all the things Firefox has brought to the browser landscape since version 1.0 and notice how many of those innovations are now in IE8, Safari and Chrome.

They also have brilliant developers, and *just* replicating all the value of Firefox would be a waste of time, so, among other things, they got a free ride on fixing things we struggled with. Now it’s our turn to fix those, and there’s nothing unhealthy about that. What would be unhealthy, in my opinion, is pretending we don’t see them and defending our approach as “the right one” (remember Bill Gates’s first comments on Firefox 1.0?).

The ability to go multi-process is important. The majority of the perceived performance improvements that Chrome has (and that Fx 3.5/3.6 brought) come from two things: tricking the user’s eyes – showing the UI 200 ms before it’s usable – and moving expensive work off the main thread. (I’m sure the performance team can explain that better.) The fact that you have to restart your browser when you install an extension is a UX bug. No user expects it or wants it. It does not bring any value, and the only reason we have it is a technical limitation.

For years we raised the bar for how a web browser should work. We set the standards in many areas. Opera set some, Safari set some, IE set some as well. Now Chrome has set some standards and we simply have to match them, ideally using the extelligence of our brilliant dev team to push things further and innovate (the Jetpack team is far from just fixing issues; they aim to bring extensions to a new level, and they should be aiming for nothing less!). There’s no reason to worry: we are making a great web browser better, and it would be unwise to ask our users to trade those nice features for the ability to use a browser with Mozilla values. Why not give them the best of both?

Personas is an interesting project. I remember my initial reaction when I encountered Personas was mostly “eh, nothing interesting”. I considered it a minor feature. I recognized that I’m not the target audience (and neither are you, I think). But on the day of the Fx 3.6 launch I got my lesson, when I received an amazing amount of feedback from my non-geek friends precisely about Personas and how the project resonated with them. It was amazing to me how emotional people got about “there’s no Real Madrid Persona, but you have a Barcelona one!” or “the pink one is soooo cute” or “my browser is so much more personal now that my you-name-it favorite actor/actress or symbol of my subculture is here”. Look at the number of Personas created by people in such a short time! It is an amazing project, and only now do I see how it fits into the Mozilla mission and vibe.

UI, on the other hand, is a much more complex thing, because it is related to personal taste and fashion (and fashion itself is, from a sociologist’s point of view, a bizarre phenomenon of human culture). But IMO it all boils down to a simple matter of cyclical change. Windows 7 brought a new UI, and IE8 followed. Chrome followed IE8, Opera followed IE8 or Chrome or both, and we’re following Windows 7, or IE8, or Chrome, or Opera – you call it. People expect a browser to match the visual style of their operating system, and Windows 7 is going to be the OS of choice for the vast majority of the world, which, as a result, will set the UI standard for the OS and apps for quite some time. We can like it or hate it, but that’s going to happen, and Firefox on Windows should, IMO, fit the OS style. What we do beyond that is the major question, and I believe our UI team is trying to come up with value on top of that. Based on past experience, I’m sure they’ll do a great job and we’ll see others learning from us. That’s how it works here. Would you prefer vendors to ignore each other’s accomplishments, or deny them?

I disagree with your perspective on the mobile world. I, for one, am waiting for Fennec on Android, and I know a lot of people who are. I’m excited to think about how we can fit the Firefox experience into Windows Mobile 7, and I’m sure it’ll be an exciting journey. Mozilla Messaging is going to generate projects related to new forms of communication, and I find this topic extremely important, so I have no worries about its sustainability. Our embedding story is nothing to be proud of, but maybe it was a trade-off we had to make in order to achieve what we aimed for. I share your concerns here, and I see many of the platform team people discussing what we can do to make it better.

I see Mozilla as pretty self-aware about many of the issues you raised, diversified enough internally to have people raising concerns, and open enough to have a forum to talk about them – your post is part of that.

Bottom line

Ultimately, I believe your concerns would all be valid if that were all that was happening in the Mozilla project – if the whole community were working on nothing but Personas, marketing or UI. But is it? Do you really feel that the elements you describe represent, as you wrote, the “Mozilla project as a whole“?

I see Mozilla as a meta-project involved in a huge number of projects touching an amazing variety of issues, and it is very hard to nail it down to one or two and call those “representative” of the community.

No matter what you think of Personas, I don’t think you can say that this one effort represents what Mozilla is doing alongside Drumbeat, Bespin, Raindrop or Weave. Whether or not you find Jetpack valuable, I hope you did not get lured in by the press foretelling the end of extensions as we know them. Can you name an example of a project that generated tens of thousands of dependencies and was irresponsibly killed by Mozilla? Have we ever done something like that? So why do you think we will do it now?

We generate an amazing number of projects of very different kinds. Globally, our community is very diversified and at different points in its journey. Some communities need more marketing – the UK, Korea, Sweden? Some, like Italy, Poland or Germany, may have enough internal marketing to consider Mozilla’s global marketing effort focused on promoting Firefox useless for them, or even “too much”. We, the Mozillians who live in those countries, should act as a membrane that adjusts the signal and gives feedback to our fellow Mozillians worldwide about what we need and what we don’t. Poland has 52% market share, and we need things like a developer community or foundation-like efforts that use the potential and trust we have generated over the years as a platform to bring Mozilla values further, so we work with Mitchell and the Mozilla Foundation, and from time to time I try to get Paul Rouget’s attention 😉 At the same time, PR- and marketing-wise, we work with a Polish PR agency, Barbara and others to balance the amount of press we generate, to avoid wasting time convincing the already convinced. That’s just adjusting. I believe we should do this much more often in the many countries that are ready for different aspects of the Mozilla project to stimulate and energize Mozillians.

An example? Here you are. You think we focus too much on marketing sites? Well, then you focus on other aspects! I believe the concept of “we have to localize all websites into all languages” is not sustainable anymore. We will generate more websites/webapps, and our local communities will decide which ones to promote locally. We don’t have to have everything localized everywhere, and that’s a great power you have to adjust the signal to your locale. Mozilla should make sure all websites/webapps/apps are localizable and let the community decide which ones to localize. Focus on the ones that are most important to you!

We have so many projects to pick from! Of different kinds, using different techniques to address different aspects of the common value set expressed in the Mozilla Manifesto. They’re also diverse in the way you think about them.

Some of them are truly unique, experimental and massive – think of our JIT approach (it took a ride from MtV to the SF airport for Taras to explain to me what is so different about our JIT approach, but now I’m proud of what we’re aiming for), think of L20n, think of Ubiquity, Bespin, Raindrop or Drumbeat.

Some of them are applications of the Mozilla way to existing concepts. Weave is not innovative because it allows sharing data, but it brought privacy into the picture. SUMO is not the very first support platform ever, but the way we approach the concept of support is innovative and “Mozillian”. Our Metrics team is not the only metrics team in the world, but they do a hell of a lot of innovation in making their work public and open to contribution, which is pretty unique. We may not be the first project ever to have a marketing team, but we approach marketing and PR in a unique and innovative way.

Some of them are just a catch-up game, and that’s not bad either. We have 350 million users; if someone brought a good idea to the world of web browsers and we can make sure that 350 million Internet users can use the Internet more safely, more easily and better, then I find that a pretty important thing to do, and I definitely expect the same from other vendors (think: partial upgrades).

Ultimately, many of them are a mix of the above, and as long as we are able to generate new projects that resonate with what people find important on the Internet, I think Mozilla makes an impact and has a bright future that we, you and me included, have to shape.

Categories
main mozilla po polsku tech

Mozilla, freedom and H.264

Note: the text below is my private opinion as a member of the Mozilla project.

This week marked an important moment in the history of the web. YouTube and Vimeo announced plans to move away from Flash technology in favor of the HTML5 standard.

A bit of history

Almost a year ago, Mozilla announced the introduction of the <video/> tag and started promoting it as an alternative to closed plugins.

Many people criticized that decision at the time. They pointed out that nobody else was implementing it, that HTML5 was not yet a standard, that Ogg/Theora, the codec we use, was not fast enough, and that it was too late – nobody would give up closed plugins for an open standard. That criticism was not groundless. All of the above points were true.

The fact that today we are discussing this from such a different point of view only shows how quickly changes happen in the world of standards nowadays, compared to, for example, the time it took to introduce the CSS2 standard. This is a huge success for the whole community gathered around the web, and I believe Mozilla played a decisive role in it.

Getting back to the topic: today we have three major engines supporting <video/> – Presto (Opera), WebKit (Safari, Chrome) and Gecko (Firefox, Camino, SeaMonkey, Flock). We have a growing number of sites that use this standard and a growing number of users whose browsers support it (in Poland, around 50% of Internet users).

Unfortunately, Chrome and Safari decided to support video using only a codec called H.264, which is closed and has to be paid for. Mozilla believes that such a step is harmful to the development of the Internet and contradicts the Mozilla Manifesto, and as a result the new Vimeo and YouTube platforms cannot be used with Firefox.

We believe we will reach an agreement, but for now the situation is difficult. In this post, I will try to explain why Mozilla considers H.264 a bad codec for the web.