
Multilingual Gecko – 2017-2018 – Rearchitecture

Between 2017 and 2018 we refactored a major component of the Gecko Platform – the intl/locale module. The main motivator was the vision of Multilingual Gecko which I documented in a blog post.

Firefox 65 brought the first major user-facing change resulting from that refactor, in the form of Locale Selection. It’s a good time to look back at the scale of changes. This post is about the refactor of the whole module, which enabled many of the changes we were able to land in Firefox in 2019.

All of that work led to the following architecture:

[Figure: the Intl module in Gecko as of Firefox 72. Yellow – C++, Red – JavaScript, Green – Rust]

Evolution vs Revolution

It’s very rare in software engineering that projects of this scale (a Gecko module with close to 500,000 lines of code) go through such a major transition. We’ve done it a couple of times – with Stylo, WebRender, etc. – but each case is profound, unique, and rare.

There are good reasons not to touch old code, and there are good reasons against rewriting such a large pile of code.

There’s not a lot of accrued organizational knowledge about how to handle such transitions, what common pitfalls await, and how to handle them.

That makes it all the more remarkable how smooth this change was. We started 2017 with a vision of putting a modern localization system into Firefox, and a platform that required a substantial refactor to get there.

We spent 2017 moving thousands of lines of 20-year-old code around and replacing them with a modern external library – ICU. We designed new unified locale management and regional preferences modules, together with new internationalization wrappers, finishing the year by landing Fluent in Gecko.

To give you a rough idea of the scope of the changes – only 10% of the ./intl/locale directory remained the same between Firefox 51 and 65!

In 2018, we started building on top of it, seeing a reduced amount of complex low-level work and a much higher velocity of higher-level changes, with three overarching themes:

  1. Migrate Firefox to Fluent
  2. Improve Firefox locale management and multilingual use cases
  3. Upstream all of what we can to be part of the Web Platform

We ended Q1 2019 with a Fluent 1.0 release, close to 50% of DTD strings migrated to Fluent, with all major new JavaScript ECMA402 APIs based on our Gecko work, and, even more importantly, with a proven runtime ability to select locales from the Preferences UI.

What’s amazing to me is that we avoided any major architectural regression in this transition. We didn’t have to revisit, remodel, revert or otherwise rethink any of the new APIs in any substantial way! All of the work, as you can see above, went into incremental updates, deprecation of old code, and standardization and integration of the new. Everything we designed to handle the Firefox UI and proposed for standardization has been accepted, in roughly that form, by our partners from the ECMA TC39 committee, and no major assumption ended up being wrong.

I believe that the reasons for that success were the slow pace we took (a year to refactor, a year to stabilize), support from the whole organization, good test coverage, and some luck.

On December 26th, 2016, I filed a bug titled “Unify locale selection across components and languages”. Inside it, you can find a long list of user scenarios which we dared to tackle and which, at the time, felt completely impossible to support.

Two years later, we had the technical means to address the majority of the scenarios listed in that bug.

Three new centralized components played a crucial role in enabling that:

LocaleService

LocaleService was the first new API. At the time, Firefox locale management was decentralized. Multiple components would try to negotiate languages for their own needs – the UI chrome API, character detection, layout, etc.

They would sometimes consult nsILocaleService / nsILocale, which were shaped after the POSIX model and allowed retrieving the locale based on per-OS environment variables.

There was also a set of user preferences, such as general.useragent.locale and intl.locale.matchOS, which some components took into account and others ignored.

Finally, when a locale was selected, the UI would use OS-specific calls to perform internationalization operations, which depended on what locale data and what intl APIs were available on the platform.

The result was that our UI could easily become inconsistent, jumping between POSIX variables, user settings, and OS settings, leading to platform-specific bugs and users stuck in a weird in-between state with their UI half in one locale and half in another.

The role of the new LocaleService was to unify that selection and provide a singleton service that would manage locale selection, negotiation, and interaction with platform-specific settings.

LocaleService was written in C++ (I didn’t know Rust at the time) and quickly started replacing all the hooks around the platform. It brought four major concepts:

  • All APIs accept and return fallback lists of locales rather than a single locale
  • All APIs manage their state through runtime negotiation (a minimal sketch of such negotiation follows this list)
  • All language tags are unified around the Unicode Locale Identifier standard
  • Events are fired to notify consumers about locale changes
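
To make the first two concepts more concrete, here is a minimal sketch of what negotiating a requested fallback list against the available locales could look like. This is an illustration only, not the actual LocaleService algorithm – the real negotiation also handles scripts, regions, ranges and likely-subtag expansion, and the function and variable names here are invented for the example.

```js
// A minimal sketch of fallback-list negotiation; not the real LocaleService
// code. It only matches exact tags and primary language subtags.
function negotiateLocales(requested, available, defaultLocale) {
  const supported = [];
  for (const req of requested) {
    // Exact match first, e.g. "fr-CA" -> "fr-CA".
    let match = available.find(av => av.toLowerCase() === req.toLowerCase());
    // Fall back to a primary-language match, e.g. "fr-CA" -> "fr".
    if (!match) {
      const lang = req.split("-")[0].toLowerCase();
      match = available.find(av => av.toLowerCase().split("-")[0] === lang);
    }
    if (match && !supported.includes(match)) {
      supported.push(match);
    }
  }
  // Always terminate the chain with the default locale.
  if (!supported.includes(defaultLocale)) {
    supported.push(defaultLocale);
  }
  return supported;
}

// Example: a French-Canadian user on an en-US build.
console.log(negotiateLocales(["fr-CA", "fr"], ["en-US", "fr", "de"], "en-US"));
// -> ["fr", "en-US"]
```

The important property is the shape of the result: always an ordered fallback chain terminated by the default locale, never a single tag.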

The result, combined with the introduction of IPC for LocaleService, moved us in the direction of a cleaner system that maintains its state and can be reasoned about.

In the process, we kept extending LocaleService to provide the lists of locales that we need in Gecko, both getters and setters, while maintaining a single source of truth and an event-driven model for reacting to runtime locale changes.

That allowed us to make our codebase ready first for locale selection, and then for the runtime locale switching that Fluent was about to make possible.
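For the event-driven part, the pattern in chrome-privileged JavaScript looks roughly like the sketch below. The observer topic and the appLocalesAsBCP47 attribute are quoted from memory and may not match the tree exactly, so treat this as an approximation rather than reference documentation.

```js
// Rough sketch: reacting to runtime locale changes in chrome-privileged code.
// The topic name and attribute are from memory and may differ in the tree.
const { Services } = ChromeUtils.import("resource://gre/modules/Services.jsm");

const localeObserver = {
  observe(subject, topic) {
    if (topic === "intl:app-locales-changed") {
      // Re-query the negotiated fallback chain and retranslate the UI.
      console.log("New app locales:", Services.locale.appLocalesAsBCP47);
    }
  },
};

Services.obs.addObserver(localeObserver, "intl:app-locales-changed");
```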

Today LocaleService is very stable, with just 8 bugs open, no feature changes since September 2018, up-to-date documentation, and ~93% test coverage.

OSPreferences

With the centralization of the internationalization API around our custom ICU/CLDR instance, we needed a new layer to handle interactions with the host environment. This layer was carved out of the old nsILocaleService to facilitate learning about user preferences set in Windows, macOS, GNOME, and Android.

It has been a fun ride of incremental improvements to handle OS-specific customizations and feed them into LocaleService and mozIntl, but we were able to accomplish the vast majority of the goals with minimal friction, and we now have a sane model to reason about and extend as we need.
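As an illustration of the kind of customization this enables – not of how OSPreferences itself is implemented (that part is C++ talking to per-platform APIs) – here is a sketch of applying an OS-level regional preference, such as a 24-hour clock, on top of the selected locale. The osPrefers24HourClock value is a hypothetical stand-in for whatever the host OS reports.

```js
// Sketch: layering an OS regional preference over the app locale.
// `osPrefers24HourClock` is a hypothetical value read from the host OS.
const osPrefers24HourClock = true;

const timeFormatter = new Intl.DateTimeFormat("en-US", {
  hour: "numeric",
  minute: "numeric",
  hourCycle: osPrefers24HourClock ? "h23" : "h12",
});

console.log(timeFormatter.format(new Date(2018, 8, 1, 17, 5)));
// "17:05" rather than the en-US default "5:05 PM"
```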

Today OSPreferences is very stable, with just 3 bugs open, no feature changes since September 2018, up-to-date documentation, and ~93% test coverage.

mozIntl

With LocaleService and OSPreferences in place, we had all the foundation we needed to negotiate a different set of locales and customize many of the intl formatters, but we didn’t have a way to separate what we expose to the Web from what we use internally.

We needed a wrapper that would allow us to use the JS Intl API, but with customizations and extensions that are either not yet part of the web standard or, due to fingerprinting concerns, cannot be exposed to the Web.

We developed `mozIntl` to close that gap. It exposes some bits that are internal-only, or too early for external adoption but ready for internal use.
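Conceptually, the wrapper works like the sketch below: internal callers get formatters pre-wired to the negotiated app locales and to internal-only defaults, while web content keeps seeing only the standard Intl surface. This is a simplified illustration, not the real mozIntl interface; the function name and defaults are made up for the example.

```js
// Sketch of the wrapper idea, not the real mozIntl API.
function createDateTimeFormat(appLocales, options = {}) {
  // Defaults we want for Firefox UI but would not impose on web content.
  const internalDefaults = { dateStyle: "long", timeStyle: "short" };
  return new Intl.DateTimeFormat(appLocales, { ...internalDefaults, ...options });
}

// Internal caller: always passes the negotiated fallback chain.
const df = createDateTimeFormat(["fr", "en-US"]);
console.log(df.format(new Date(2019, 0, 29)));
// e.g. "29 janvier 2019 à 00:00"
```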

mozIntl is pretty stable now, with a few open bugs, 96% test coverage, and a lot of its functionality now in the pipeline to become part of ECMA402. In the future we hope this will make mozIntl thinner.

2019

In 2019, we were able to focus on the migration of Firefox to Fluent. That required us to resolve a couple more roadblocks, like startup performance and caching, and it allowed us to start looking into the flexibility Fluent gives us to revisit our distribution channels and build-system models, and to enable Gecko-based applications to handle multilingual input.

Gecko is, as of today, a powerful software platform with a great internationalization component, leading the way in advancing Web standards and fulfilling the Mozilla mission by making the Web more accessible and multilingual. It’s a great moment to celebrate this achievement and commit to maintaining that status.


Pseudolocalization in Firefox

One of the core projects we worked on over 2017 was a major overhaul of the Localization and Internationalization layers in Gecko, and throughout the first half of 2018 we were introducing Fluent into Firefox.

All of that work was “behind the scenes” and laid the foundation that enables us to bring higher-level improvements in the future.

Today, I’m happy to announce that the first of those high-level features has just landed in Firefox Nightly!

Pseudolocalization?

Pseudolocalization is a technique for testing the localizability of software UI. It allows developers to check how the UI they are working on will look when translated, without having to wait for translations to become available.

It shortens the Test-Driven Development cycle and lowers the burden of creating localizable UI.
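To give a flavor of what the accented strategy does, here is a toy transform. It only maps a handful of characters; the real Firefox strategies also expand string length by roughly 30% and are careful not to mangle placeables and markup.

```js
// Toy "accented" pseudolocalization; illustration only.
const ACCENTED = {
  a: "ȧ", e: "ḗ", i: "ī", o: "ǿ", u: "ŭ",
  A: "Ȧ", E: "Ḗ", I: "Ī", O: "Ǿ", U: "Ŭ",
};

function pseudoAccented(message) {
  return [...message].map(ch => ACCENTED[ch] ?? ch).join("");
}

console.log(pseudoAccented("Open a New Window"));
// "Ǿpḗn ȧ Nḗw Wīndǿw"
```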

Here’s a demo of how it works:

How to turn it on?

At the moment, we don’t have any UI for this feature. You need to create a new preference called intl.l10n.pseudo and set its value to accented for a left-to-right, ~30% longer strategy, or bidi for a right-to-left strategy. (more documentation).

If you test the bidi strategy, you will also likely want to switch another preference – intl.uidirection – to 1. This is because right now the directionality of text and layout are not connected. We will improve that in the future.

We’ll be looking into ways to expose this functionality in the UI, and if you have any ideas or suggestions for what you’d like to see, let’s talk!

Nitty-gritty details

Although the feature may seem simple to add – the actual patch that adds it is less than 100 lines long – it took many years of prototyping and development to build the foundation layers that allow for it.

Many of the design principles of Project Fluent, combined with the vision shaped by the L10n Drivers Team at Mozilla, allowed for dynamic runtime locale switching and declarative UI localization bindings.

Thanks to all of that work, we don’t have to require special builds or increase the bundle size for this feature to work. It comes practically for free, and we can extend and fine-tune pseudolocalization strategies on the fly.

Kudos

If this feature looks cool, in the esoteric way localization and internationalization can, please make sure to high-five the people who put a lot of work into getting this done: Staś Małolepszy, Axel Hecht, Francesco Lodolo, Jeff Beatty and Dave Townsend.

More features are coming! Stay tuned.


Multilingual Gecko Status Update 2018.1

As promised in my previous post, I’d like to do a better job of delivering status updates on internationalization and localization technologies in Gecko at shorter intervals than once per year.

In the previous post we covered recent history up to Firefox 58, which was released in January 2018. Since then we have finished and shipped Firefox 59 and also finished all major work on Firefox 60, so this post will cover the two.

Firefox 59 (March)

Firefox 58 shipped with a single string localized using Fluent. In 59 we took the next step and migrated 5 strings from the old localization system to Fluent. This allowed us to test all of our migration code to ensure that, as we port Firefox to the new API, we preserve all the hard work of hundreds of localizers.

In 59 we also made several improvements to the performance of how Fluent operates, all while waiting for stylo-chrome to land in Firefox 60.

In LocaleService, we made another important switch. We replaced the old general.useragent.locale and intl.locale.matchOS prefs with a single new pref, intl.locale.requested.

This change accomplished several goals:

  • The name represents the function better. Previously it was pretty confusing for people as to why Gecko doesn’t react immediately when they set the pref. Now it is clearer that this is just a requested locale, and there is language negotiation that, depending on the available locales, will switch to it or not.
  • The new pref is optional. Since by default it matches the defaultLocale, we can skip it and treat its absence as the default mode, in which we follow the default locale. That allowed us to remove some code.
  • The new pref can store locale lists. It is meant to store a comma-separated list of requested locales, like "fr-CA, fr-FR, en-US", in line with our model of handling locale lists rather than single locales.
  • If the pref is defined and the value is empty, we look to the OS for the locale to use, making it a replacement for the matchOS pref. (A small sketch of these three states follows this list.)
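
Here is that sketch of how the three states could be interpreted – simplified, and not the actual LocaleService code; the function and argument names are invented for the example:

```js
// Sketch: the three states of intl.locale.requested.
function resolveRequestedLocales(prefValue, defaultLocale, osLocales) {
  if (prefValue === undefined) {
    // Pref not set: follow the default locale of the build.
    return [defaultLocale];
  }
  if (prefValue.trim() === "") {
    // Pref set but empty: follow the OS (replacing intl.locale.matchOS).
    return osLocales;
  }
  // Otherwise: a comma-separated fallback list of requested locales.
  return prefValue.split(",").map(loc => loc.trim()).filter(Boolean);
}

console.log(resolveRequestedLocales(undefined, "en-US", ["de"]));  // ["en-US"]
console.log(resolveRequestedLocales("", "en-US", ["de"]));         // ["de"]
console.log(resolveRequestedLocales("fr-CA, fr-FR, en-US", "en-US", ["de"]));
// ["fr-CA", "fr-FR", "en-US"]
```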

This is particularly important because it took us a very long time to unify all uses of the old prefs, remove them from all around the code, and finally be able to switch to the new one, which should serve us much better.

Next comes the usual set of updates, including an update to ICU 60 by Andre and cleanups by Makoto Kato – we’re riding the wave of removing old APIs and unifying code around ICU and encoding_rs.

Lastly, as we start looking more at aligning our language resources with CLDR, Francesco started sorting out our plural-rule differences and language and region names. This is the first step on the path to upstreaming our work to CLDR and reusing it in place of our custom data.

Notable changes [my work] [intl]:

Firefox 60 (May)

Although Firefox 60 has not yet been released as of today, the major work cycle on it has finished, and it is currently in the beta channel for stabilization.

In it, we’ve completed another milestone for Fluent, migrating not just a couple but over 100 strings in Firefox Preferences from the old API. This marks the first release where Fluent is actually used to localize a visible portion of the Firefox UI!

As part of that work, we pushed our first significant update of Fluent in Gecko, and landed a special chrome-only API to get Fluent’s performance on par with the old system.

With an increase in the use of Fluent, we also covered it with Mozilla eslint rules, improved missing strings reporting, and wrote an Introduction to Fluent for Firefox Developers.

On the Locale Management side, we separated out the mozilla::intl::Locale class and further aligned it with BCP47.

But the big change here is the switch of the source of available locales from the old ChromeRegistry to L10nRegistry.

This is the last symbolic hand-over from the old model to the new: from that moment on, the locales registered in L10nRegistry will be used to negotiate language selection for Firefox, and ChromeRegistry becomes a regular consumer rather than the provider of language selection.

We’re very close to finalizing the LocaleService model after over a year of refactoring Gecko!

The regular healthy batch of cleanups happened as well: Henri switched more code to use encoding_rs and updated encoding_rs to 0.7.2, Jeff Walden performed a major cleanup of our SpiderMonkey Intl source code, Andre added caches for many Intl APIs to speed them up, and Axel updated compare-locales to 2.7.

We also encountered two interesting bugs – Andre dove deep into ICU to fix `Intl.NumberFormat` breaking on rounding in Persian, and I had to disable some of our bidirectionality features in Fluent due to a bug in a Windows API.

Notable changes [my work] [intl]:

Summary

With all that work in, we’re slowly settling the new APIs and finalizing the cleanups, and the bulk of the work is now going directly into switching away from DTD and .properties to Fluent.

As Firefox 60 is getting ready for its stable release, we’re accelerating the migration of Preferences to Fluent, hoping to complete it in the 61 or 62 release. Once that is done, we’ll evaluate the experience and make recommendations for the rest of Firefox.

Stay tuned!


All Hands On Deck – How you can use your skills to contribute to Firefox 57 success

Firefox 57 in the making

If you’ve been following Firefox development over the last year, you probably know that we’re hard at work on a major refactor of the browser, codenamed Quantum.

It’s been a very exciting and challenging time, with hundreds of engineers bringing new concepts to life and incorporating them into our engine, Gecko. Those refactors, which will culminate in the release of Firefox 57, touch the very foundation of our engine and require massive changes to it.

We’ve moved the whole engine from a single-process to a multi-process model, which is perhaps the hardest possible refactor a major piece of software can go through. We’ve added a whole new systems-level programming language, and we’re replacing major components of the engine such as the styling and rendering systems. There are substantial changes to our networking and security layers, as well as completely new components like a new extension ecosystem, a virtual reality stack, and APZ. We’ve added another new programming language as part of our JS engine, and we’ve significantly remodeled our build process and infrastructure, allowing for quick feature testing, telemetry analysis to understand how our changes impact users, and many, many more things.

The changes are massive. This is the largest refactor to Gecko and Firefox in my 17 years of contributing to the project. But we’re not done yet…


Localization framework changes in Firefox OS 2.0 and plans for 2.1

On Monday we branched Firefox OS 2.0, which is the first branch to contain the new localization library developed by my team.

What landed for 2.0

Library landing and reactions

The library itself landed exactly two months ago. In order to avoid any potential regressions, we put a lot of work into ensuring that it matches the behavior of the old code it replaced. I believe we can now claim success, because after two months of baking on master we didn’t get any serious regressions that would require us to change anything in our code.

The new library comes with a lot of unit tests and is stricter than the old code, so we had to fix a couple of small bugs where code had been passing an object instead of a string to our API, and one related to a test failing on an old machine with too little memory. Those were simple to catch and fix.

We also got a few requests to improve the console log and error output that the library produces in order to simplify developers’ work.

New features

Pseudo-locales

The major new thing that Stas completed in this cycle is support for pseudo-locales. While this could have been done with the old code, it was significantly easier with the new code thanks to architectural decisions like the separation of build-time and runtime code.

Pseudo-locales allow developers to evaluate their UI’s localizability against an artificially generated, English-like locale to catch any hardcoded strings. It also generates a right-to-left locale for testing purposes. Before that, we had been relying on often-outdated localizations that we kept with our source code. Now we can always test against a fully generated pseudolocalization.

mozL10n.once wrapper

Another new feature is the introduction of the mozL10n.once wrapper. We identified that a lot of Gaia apps wait for localization to be ready before they initialize themselves. That makes sense, since a lot of those apps want to work with the UI and localized strings, but the challenge in an asynchronous world is that you never know whether your code fires before or after mozL10n is ready.

Because of that, simply setting an event listener and waiting for the window.onlocalized event is not enough (what if it already fired before your code was launched?). Developers were using the mozL10n.ready wrapper, but the problem is that it was designed to re-fire on each retranslation, which meant that your init code ran every time the user changed the language. That’s not the intended behavior, though admittedly a rarely triggered one. What’s worse, we retained all the init code in memory.

Now, with mozL10n.once we can safely initialize code when l10n resources are available and free the memory right after that.
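The difference between the two wrappers can be illustrated with a small stand-in. This is not the real mozL10n code, just a sketch of the semantics: ready keeps its callback and re-fires it on every retranslation, while once fires exactly once, even if localization was already ready when it was registered, and then releases the callback.

```js
// Minimal stand-in for the ready/once semantics; not the real mozL10n code.
function createL10nStub() {
  const persistent = [];  // callbacks registered via ready()
  let pending = [];       // one-shot callbacks registered via once()
  let isReady = false;

  return {
    ready(cb) {
      persistent.push(cb);      // kept forever, re-run on every retranslation
      if (isReady) cb();
    },
    once(cb) {
      if (isReady) {
        cb();                   // already localized: fire immediately, never stored
      } else {
        pending.push(cb);       // fired on the first ready event, then dropped
      }
    },
    _fireReady() {              // simulates l10n resources being (re)loaded
      isReady = true;
      pending.forEach(cb => cb());
      pending = [];             // one-shot callbacks are released from memory
      persistent.forEach(cb => cb());
    },
  };
}

const l10n = createL10nStub();
l10n.once(() => console.log("init app"));        // startup code, runs once
l10n.ready(() => console.log("retranslate UI")); // runs on every language change
l10n._fireReady();  // "init app", then "retranslate UI"
l10n._fireReady();  // "retranslate UI" only
```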

Uses of l10n API

On top of adding new features, we’ve been mostly busy investigating and improving how the default Firefox OS apps interact with the localization API. That led to multiple design decisions, including the introduction of the mozL10n.once wrapper described above.

Once we got the new wrapper, we started analyzing the bootstrapping code of each and every Gaia app and updating it to use the proper L10n API. Twenty-two fixed bugs later, we’re done!

It’s incredible how much we were able to accomplish in just two short months. We feel much better right now about the bootstrap process, and we have a clear picture of what we want to do next.

Use new navigator.languages API

Thanks to Mounir’s work on the navigator.languages API and its implementation, we were able to remove the only Mozilla-specific API in mozL10n. That means that mozL10n should work in any modern browser now!

What we’re working on for 2.1

2.1 will be as ambitious for us as 2.0 was. The hashtag of the work is still #cleanup, but now it’s much more about modifying our API so that it’s more transparent to developers and requires less manual code in their apps.

entity.attributes become node attributes

The first thing we were able to land is the transition away from assigning l10n entity attributes as node properties. We cleaned up the hacks that had been used and switched to setting entity attributes as node attributes.

DOM overlays

Next, we have one leftover from the previous change, and that is the infamous innerHTML. We currently don’t have any clean way to inject a localizable DOM fragment in Gaia. Fortunately, we have one that fits perfectly in L20n. It’s called DOM Overlays, and we’re working on getting them into mozL10n. That will allow us to further secure the L10n API and remove any innerHTML calls.
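A sketch of the idea behind DOM Overlays, not the actual L20n/mozL10n implementation: the translation may carry simple text-level markup, which is parsed in an inert template and merged into the existing element through a small allow-list, so application code never assigns innerHTML on live nodes. The function name and allow-list here are invented for the example, and attribute sanitization is omitted.

```js
// Sketch of overlaying a translation that contains simple markup.
const ALLOWED_TAGS = new Set(["EM", "STRONG", "SUP", "SUB", "A"]);

function overlayTranslation(element, translationMarkup) {
  // Parse the translation in an inert template, away from the live DOM.
  const template = document.createElement("template");
  template.innerHTML = translationMarkup;

  const fragment = document.createDocumentFragment();
  for (const node of template.content.childNodes) {
    if (node.nodeType === Node.TEXT_NODE || ALLOWED_TAGS.has(node.nodeName)) {
      fragment.appendChild(node.cloneNode(true));
    } else {
      // Disallowed elements are flattened to their text content.
      fragment.appendChild(document.createTextNode(node.textContent));
    }
  }
  element.textContent = "";
  element.appendChild(fragment);
}

// overlayTranslation(document.querySelector("[data-l10n-id='update-notice']"),
//                    "A <em>new</em> version is available.");
```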

Mutation Observers

The majority of localizability code in Gaia apps is related to the localization of DOM nodes. With Mutation Observers we will be able to significantly reduce the number of manual calls to the mozL10n API and make the majority of cases just a matter of setting the data-l10n-id attribute, with Mutation Observers doing the rest.

Not only will it reduce the use cases for mozL10n.translate and mozL10n.localize, but I expect to be able to cut by over 50% the number of manual mozL10n.get calls and the manual operations currently used to set the result of that call into the node.

Mutation Observers will simplify Gaia code, reduce the number of bugs related to language switching, and get us closer to the runtime l10n API that we want to offer for Gecko.
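The pattern looks roughly like the sketch below; translateElement is a hypothetical stand-in for the real mozL10n machinery, and the sketch ignores details such as descendants of newly added subtrees.

```js
// Sketch: a MutationObserver watches data-l10n-id changes and (re)translates
// affected nodes, so app code only ever sets the attribute.
function translateElement(el) {
  // Hypothetical lookup; the real code resolves the id against loaded
  // l10n resources for the current language.
  el.textContent = `[${el.getAttribute("data-l10n-id")}]`;
}

const observer = new MutationObserver(mutations => {
  for (const mutation of mutations) {
    if (mutation.type === "attributes") {
      translateElement(mutation.target);
    } else {
      for (const node of mutation.addedNodes) {
        if (node.nodeType === Node.ELEMENT_NODE && node.hasAttribute("data-l10n-id")) {
          translateElement(node);
        }
      }
    }
  }
});

observer.observe(document.body, {
  subtree: true,
  childList: true,
  attributes: true,
  attributeFilter: ["data-l10n-id"],
});

// App code now only does:
//   node.setAttribute("data-l10n-id", "settings-title");
// and the observer takes care of putting the translation into the node.
```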

Bootstrap

There are still some interesting edge cases around how code bootstrapping relies on particular pieces of the environment. Does your code need the DOM to be interactive? Does it need l10n resources to be loaded? Or maybe you need the DOM to be localized? All those events happen asynchronously, and we currently do not have a clean way to guard against any combination of them that your init code may require.

We’re working on a bootstrapping wrapper that will allow app developers to simply define which pieces of the environment their initialization should be blocked by.

That will further secure the app bootstrapping process and limit the risk of race conditions.
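A minimal sketch of that idea, with hypothetical helper and option names: the app declares which parts of the environment its init code depends on, and the wrapper resolves once all of them are ready.

```js
// Sketch only; `whenEnvironmentReady` and its options are hypothetical.
function whenEnvironmentReady({ dom = false, l10n = false } = {}) {
  const conditions = [];
  if (dom) {
    conditions.push(new Promise(resolve => {
      if (document.readyState !== "loading") {
        resolve();
      } else {
        document.addEventListener("DOMContentLoaded", () => resolve());
      }
    }));
  }
  if (l10n) {
    // mozL10n.once fires immediately if resources are already loaded,
    // so this is safe regardless of ordering.
    conditions.push(new Promise(resolve => navigator.mozL10n.once(resolve)));
  }
  return Promise.all(conditions);
}

// whenEnvironmentReady({ dom: true, l10n: true }).then(() => app.init());
```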

mozDOMLocalized

One part of the bootstrap puzzle is how we fire events when certain bootstrap milestones happen. Right now we fire the window.onlocalized event, which means that mozL10n resources have been loaded, but it doesn’t tell us whether the document’s DOM has been localized (and is ready to be displayed).

With the work on the new event, we’ll be able to remove the global one and settle on triggering one event on the document and one on the mozL10n object. Did I tell you that we’re still cleaning up? 🙂

Move mozL10n to document

We originally placed the mozL10n object as a property on the navigator object. Because our API is transitioning to be per-document, keeping the mozL10n API on navigator becomes an inconsistency and an obstacle. It hardly fits a world with iframes, Shadow DOM and HTML Templates. We’re going to move it to document.mozL10n.

Remove inline l10n

One of the build-time optimizations that we have in Gaia is called inline l10n. We store a portion of the l10n resources within each HTML file in order to localize the UI before it is displayed. It’s not very scalable and costs us memory and performance, but it has historically helped us prevent flashes of untranslated content. We hope to be able to remove this optimization in this cycle, which will significantly simplify our internal code and give us some small memory and performance wins.

Is this L20n yet?

While we introduce L20n concepts into mozL10n, we’re still pretty far from being able to say that we support the L20n API in Gaia. There’s a lot of work to do, and it’s going to be challenging work as we port L20n concepts to Gaia and merge the lessons we learned while working on Gaia into the L20n spec and implementation. What we hope to end up with is a single codebase used in Gaia and offered to web developers.

It’s an exciting journey, and I’m so happy to be making Firefox OS’s localizability the most modern among all OSes!


Thunderbird to become the default in Ubuntu!

Great news from Budapest, where the Ubuntu project is holding its version of our All Hands 😉

According to Michael Larabel, Thunderbird will become the default choice for Ubuntu’s mail/newsgroup client! That’s strong proof of the progress Thunderbird has made, and trust me, Ubuntu does not swap its defaults that easily.

Well, it now seems that we’ll have two of our products as pretty important defaults in this most popular Linux distribution – Firefox and Thunderbird! (although we do have strong competition for the browser slot).

Congratulations to the team working on Thunderbird!


A mini campaign for the Firefox 4 release

Along with the upcoming release of Firefox 4, Mozilla is planning a small campaign promoting HTML5 & Friends.

Aviary.pl is looking for people willing to help translate this page. We offer access to a test server, a small number of strings to translate, and a good dose of the well-earned satisfaction of a job well done.

If you’re interested, email me (zbraniecki at aviary dot pl) or go straight to Verbatim –  – create an account and add your suggestions.

Note: high-quality translations may result in indecent proposals of further collaboration from Aviary.pl 😉


Hard Blockers Counter 1.0

A little more coding and a few bug reports later, HBC is ready for prime time. Version 1.0 fixes the nasty toolbar height problem, gives the user an indication of the interval covered by the plot, and is just overall better.

It can be downloaded from an addons.mozilla.org listing, and the source code is available at builder.addons.mozilla.org. 🙂

A few of the lessons learned and thoughts:

  • Builder is awesome, but it needs more real-life users. A lot of bugs only reproduce after you’ve been writing your extension for some time, through hundreds of revisions, etc.
  • The AddonSDK is excellent for this kind of extension. It has everything you may want and makes the whole code extremely clean and simple to write and maintain. Just look at it – about 50 lines of core code – your cat could read that.
  • The AddonSDK needs more real-life users. As with Builder, bugs only show up when you really use the extension you created.
  • AMO is an excellent, developer-friendly platform – it gives me a lot of satisfaction in the form of stats and the ability to manage my extension’s release process.
  • AMO–Builder bindings need more real-life users. I felt like I was the first to try to push a Builder-based extension to AMO – many trivial bugs can only be revealed if you try to go through the whole thing.
  • AMO’s review/release process is excellent for the extensions of the Old Days. It gives us a pool of high-quality, verified extensions that are easy to find and safe to use. It does not work with agile development. Builder and the AddonSDK make creating ad-hoc extensions super simple and quick (literally, 2 hours and you’re done with the first version; every new version is about 15 minutes of work). When you then push it to AMO, it feels like Matrix slow motion – you suddenly wait days for a preliminary review, not to mention the almost two weeks you have to wait for a full review. My last reviewed version is super old compared to what I claim to be the “stable” one now 🙁

This issue requires a little bit of explanation. I’m not trying to say that what AMO reviewers are doing is wrong – quite the opposite, I believe the whole process is excellent, and anything that is exposed to millions of users should get some time to season and be tested and reviewed. It’s just that AddonSDK/Builder gives you a totally different setup that fits different needs. I believe AMO will need a workflow for extensions that are created in 10 minutes, distributed in 20 minutes, updated 5 times over 4 hours, and become useless after one or two days.

Think of a conference where the schedule is updated often and people have a hard time tracking it. Using AddonSDK/Builder you can create an extension for it in literally 20 minutes (xhr, panel, widget). AMO is excellent for distributing it, updating your users, etc., but it requires a very different approach than, say, AdBlock or Firebug. Then you add a feature (the ability to mark the talk you want to attend and get a notification when its room or time changes) and upload a new version 15 minutes later. You want to switch all your users to the new one now. Then you fix a small bug affecting Linux users, and update users once again.

It’s amazing that Firefox is becoming a platform where this is possible, and I can’t wait for such an application for AMO 🙂

  • The AddonSDK requires a lot of users with their use cases. Myk’s approach is to iterate often, which means getting version 1 out ASAP and then adding new features for version 2, instead of trying to build the ultimate solution without a release. I love this approach and it serves the AddonSDK well, but now we need version 2 of many of the packages there – and that can only happen if people start using the packages for real-life extensions and report what they miss. Like: Widget content cannot be easily themed. Or: you cannot control a Panel’s scrollbar appearance. The Jetpack team cannot plan for those use cases; they have to come from Jetpack users. So be brave! Try things, report everything you need! 🙂
  • There is a group of at least 500 people who deeply care about our release process. They’re ready to increase the number of items on their screen to get continuous updates on our progress toward Firefox 4. And it’s been just two days. It can motivate people to help, make them feel the sense of progress, and help them understand the challenges better. It draws outsiders closer to the inner circle. I believe we can do much more, and the nightly users, Mozilla Planet readers, and the audience of my extension deserve the chance to get more info which can help them start contributing! 🙂


Hard Blockers Counter 0.8 (0.9) – with a plot!

The tremendous success of the Hard Blockers Counter extension (it got over 300 daily users and 4 positive reviews – though they were lost in the transition :() motivated me to spend another too-early morning adding a history plot to the counter.

It was a real pleasure and a good lesson to go through – simpleStorage, timer, panel, and contentScript communication stuff. I reported some bugs in the toolkit, in the Builder, and in the Builder->AMO release process. It’s all been fixed 😀
The only real unresolved issue is that the widget affects the height of the toolbar. Reported as bug 626728.

So, 0.8 is ready and waiting, enjoy and fight that bug number down! 🙂

The source code is, as always, available.

[update] Actually, Builder gave me an old version of my jetpack and I submitted it to AMO as 0.8. So I have now added 0.9, but it has to wait for a review – so if you want the colorful plot, download it directly from here.


An interview about Fx4 and more

It’s hard to believe that I haven’t written here in two months… I buried myself in the L20n project so deeply that I couldn’t find the motivation to blog. :/ I promised myself to change that this year, because I believe that communicating what you’re working on helps the whole community – so expect more, and shout at me when I start lagging 😉

To kick off the year, here’s an interview that appeared on firefox.bajo.pl – enjoy the read, and I’m looking forward to your opinions 🙂