Net at Work GmbH is a product-based consultancy company with a glowing feather in its cap: NoSpamProxy. NoSpamProxy is a multi-faceted secure email gateway solution that provides powerful anti-spam and anti-malware capabilities. Moreover, it offers a number of other features such as email encryption, large file transfer, and email disclaimer management.
Background
The NoSpamProxy user interface was originally created using a WPF and XAML-based technology stack. As we have an international clientele, localization of the UI is of utmost importance to us. To localize the WPF application we had to translate all texts used in the C# code files as well as all texts displayed in the WPF UI. To achieve this, we had to ensure that every piece of text used in the application ends up in the compiled assembly along with a unique ID. To this end, all texts from the code files were placed into dedicated resource files of the project. Additionally, the updateuid target of MSBuild, hooked into a pre-build step, ensures that every text in the UI has a unique ID as well. We can therefore extract all of these texts from the compiled assembly and localize them as needed. An internal tool is then used to manage the translation. After all translations are done, the tool uses the original assemblies and the translations to build a new resource assembly for each locale.
Currently, we are in the process of migrating part of the solution to the cloud. For this, we also need to migrate part of the UI to a web-based solution, which has to be multilingual as well. However, we quickly realized that we were missing the whole ecosystem of resource files, along with automatic resource ID generation, extraction, and compilation. As a result, we decided to create our own tools.
Localization Made Easy: aurelia-i18n + Net at Work GmbH Toolchain
As we use Aurelia for the web app, we leverage the aurelia-i18n plugin to facilitate the localization of the app. Internally, the plugin uses i18next to provide the translation service.
In very simple terms, using the plugin is a two-step process. First, you mark the elements and attributes in the HTML templates that need to be translated, using a localization attribute such as t with a unique value. Then you initialize the i18next instance with the localization resources: a JSON object per locale, where each key is the value of a t attribute and the corresponding value is the translation.
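For example, marking a heading with the key used in the configuration snippet below might look like this (the key name is only a placeholder for illustration):
<template>
  <!-- the value of the t attribute is the key looked up in the locale resources -->
  <h2 t="key">value EN</h2>
</template>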
aurelia.use
  .plugin("aurelia-i18n", (instance) => {
    let aliases = ["t"];
    // ...rest
    return instance.setup({
      attributes: aliases,
      lng: "en",
      resources: {
        en: { translation: { key: "value EN" /* , ... */ } },
        de: { translation: { key: "value DE" /* , ... */ } },
        // ... other languages
      }
      // ...rest
    });
  });
The code snippet above shows a very simplistic usage of the plugin, where the translation JSON objects are fed directly to the i18next instance. If you are using webpack and support many languages, the setup shown above can increase your bundle size unnecessarily. In that case, you are better off using a backend service, as shown in the official documentation of the plugin. Such a service can also be helpful for pre-processing the resources, for example merging resources from different sources at runtime.
Once these resources are fed to the plugin, Aurelia will update the translations on the fly whenever the active locale is changed by calling i18n.setLocale('...').
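For example, a simple language switcher view-model could inject the I18N service and change the active locale; the class name and locale codes below are only illustrative:
import { inject } from 'aurelia-framework';
import { I18N } from 'aurelia-i18n';

@inject(I18N)
export class LocaleSwitcher {
  constructor(i18n) {
    this.i18n = i18n;
  }

  // called e.g. from a click handler in the view
  switchTo(locale) {
    // setLocale returns a promise; bound translations are updated automatically
    return this.i18n.setLocale(locale); // e.g. 'de' or 'fr'
  }
}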
Let us now focus on the fact that, to use aurelia-i18n, we need the resources marked with unique keys, along with the translation JSON objects/files for every supported locale. You may attempt to create the keys by hand. However, that is an error-prone process, as you have to ensure that the keys are unique at a global level. Moreover, creating and maintaining the files manually is a tedious job that gets harder as the number of supported locales grows. Therefore, the idea is to semi-automate the process as much as possible. To do that, let us first understand the steps involved in the process, which are as follows:
- Generating the i18n keys for the HTML templates.
- Extracting the keys and values to a central external resource.
- Updating the central translation resource with the translations of the values for the different languages.
- Compiling the translations to generate individual locale files for each language that can be consumed by i18next.
We strongly believe that the quality of expert human translation is still unmatched. Therefore, apart from the third step, all the other steps can be automated by using the toolchain offered by our company. Using these packages, a gulp task can be composed easily to automate the process. Let us take a look at these tools below.
For the purpose of this example, assume that all of our HTML templates are under the src directory. Additionally, JSON files can also be used as external resource files, which are distinguished in this example by the .r.json extension (more on this later). With this information, let us first set up some constants that are used later by the tasks.
const gulp = require("gulp");
const path = require("path");

const src = path.resolve(__dirname, "src"),
  json = path.resolve(src, "*.r.json"),
  html = path.resolve(src, "*.html");
ID Generation
This is the first step in the process. Here, we identify and mark the text in the HTML templates that needs to be translated, using the gulp-i18n-update-localization-ids package. This is done by means of a whitelist of HTML tags and attributes. The package then generates the t attributes (or the configured localization attributes) with unique values for the elements and attributes found in the HTML templates. These values later act as the i18n keys.
const updateLocalizationIds = require('gulp-i18n-update-localization-ids');

const i18nGlobalPrefixes = new Map();

const generateI18nKeys = function () {
  return gulp.src(html, { since: gulp.lastRun(generateI18nKeys) })
    .pipe(updateLocalizationIds({
      emit: 'onChangeOnly',
      ignore: [{
        // skip interpolation expressions such as ${...}
        content: v => v.startsWith('${') && v.endsWith('}')
      }],
      // prefix the generated ids with the file name, e.g. template1.t0
      idTemplate: updateLocalizationIds.prefixFilename(i18nGlobalPrefixes),
      // only these elements/attributes are marked for translation
      whitelist: [
        {
          tagName: 'h2'
        },
        {
          tagName: 'my-custom-el',
          attrs: ['some-value']
        }
      ]
    }))
    .pipe(gulp.dest(src));
}
A simplistic code snippet for generating the i18n keys is shown above (for the full set of options, please refer to the documentation).
Let us assume that there are two HTML templates:
<!-- template1.html -->
<template>
<h2>some text</h2>
<!-- ... -->
</template>
<!-- template2.html -->
<template>
<my-custom-el some-value="value for the property in my-custom-el"></my-custom-el>
<!-- ... -->
</template>
The ID generation task transforms those templates into the following.
<!-- template1.html -->
<template>
<h2 t="template1.t0">some text</h2>
<!-- ... -->
</template>
<!-- template2.html -->
<template>
<my-custom-el some-value="value for the property in my-custom-el" t="[some-value]template2.t0"></my-custom-el>
<!-- ... -->
</template>
Resource Extraction
In the next step, all the keys and their corresponding values need to be extracted to a central JSON file. This file is then updated with the translations for the different languages. Ideally, this is the only file you need to deal with when translating. The JSON file also includes metadata that can be used to track whether a resource has changed since the last translation update.
In this context, we want to emphasize that HTML files are not the sole source of translation resources. External JSON files can also be used to provide translation resources. This might look familiar if you are acquainted with resource files in .NET. Some of the use cases for such files are as follows:
- Providing translations via code, for example a locale-specific validation failure message using withMessageKey.
- Providing context-specific translations, for example:
  - using t="template2.r.Status" t-params.bind="{context: item.status}",
  - using interpolation t="template2.r.Status.${item.Status}", or
  - using pluralization.
We use a convention of marking such external resource JSON files with the .r.json extension (this is optional).
Naturally, we also need to translate those resource files for every supported locale.
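To make this concrete, a hypothetical src/template2.r.json could contain nothing more than { "external": "this is an external resource" }, which matches the template2.r.external key that shows up in the extraction output below. Such a key can then be resolved from code via the I18N service; the view-model below is only an illustrative sketch:
import { inject } from 'aurelia-framework';
import { I18N } from 'aurelia-i18n';

@inject(I18N)
export class StatusBadge {
  constructor(i18n) {
    // resolve a key that originates from the external .r.json resource
    this.externalText = i18n.tr('template2.r.external');
  }
}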
Using gulp-i18n-extract, the resources can be extracted from both the HTML templates and the JSON files. A minimalist code snippet is shown below.
const i18nExtract = require('gulp-i18n-extract');

// path to the central translation file
const translations = path.resolve(__dirname, "translations/i18n.json");

const i18nExtractOptions = {
  // here we are saying that we want to extract resources from both HTML and JSON files
  plugIns: [
    new i18nExtract.html(),
    new i18nExtract.json()
  ],
  markUpdates: true,
  defaultLanguages: ['de', 'fr'] // <-- here goes the list of supported languages
};

const extractI18n = function () {
  return gulp.src([html, json])
    .pipe(i18nExtract.extract(translations, i18nExtractOptions))
    .pipe(gulp.dest("."));
}
This task generates a file that looks similar to the one shown below:
{
"template1": {
"content": {
"template1.t0": {
"content": "some text",
"lastModified": "2019-06-20T16:23:42.306Z",
"needsUpdate": true,
"translations": {
"de": {
"content": "",
"lastModified": ""
},
"fr": {
"content": "",
"lastModified": ""
}
}
}
},
"src": "src\\template1.html"
},
"template2": {
"content": {
"template2.t0": {
"content": "value for the property in my-custom-el",
"lastModified": "2019-06-20T16:23:42.316Z",
"needsUpdate": true,
"translations": {
"de": {
"content": "",
"lastModified": ""
},
"fr": {
"content": "",
"lastModified": ""
}
}
}
},
"src": "src\\template2.html"
},
"template2.r": {
"content": {
"template2.r.external": {
"content": "this is an external resource",
"lastModified": "2019-06-20T16:23:42.318Z",
"needsUpdate": true,
"translations": {
"de": {
"content": "",
"lastModified": ""
},
"fr": {
"content": "",
"lastModified": ""
}
}
}
},
"src": "src\\template2.r.json"
}
}
You can now update this file with the translations for the various languages. This file needs to be put under source control to track the changes. It will later be used to generate the final translation files for every locale.
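For illustration, filling in the German translation of template1.t0 might look roughly like this; the German text and timestamp are made up for this example, and fields such as needsUpdate are maintained by the tooling:
"translations": {
  "de": {
    "content": "etwas Text",
    "lastModified": "2019-06-21T09:00:00.000Z"
  },
  "fr": {
    "content": "",
    "lastModified": ""
  }
}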
Compile
After the i18n.json file has been updated with the translations, the last step is to compile the file to produce the individual locale files. We do this using the gulp-i18n-compile2 package.
const i18nCompile = require('gulp-i18n-compile2');

// destination directory for the locale files
const locales = path.resolve(__dirname, "locales");

const compileOptions = {
  fileName: "translation.json", // <-- name of the file to generate
  defaultLanguage: "en"
};

const compileI18n = function () {
  return gulp.src(translations) // <-- this is the path to the 'i18n.json' generated in the previous step
    .pipe(i18nCompile(compileOptions))
    .pipe(gulp.dest(locales));
}
This task produces the following result:
.
└───locales
    ├───de
    │   └───translation.json
    ├───en
    │   └───translation.json
    └───fr
        └───translation.json
The content of these files looks as follows:
{
"template1": {
"t0": "some text",
},
"template2": {
"t0": "value for the property in my-custom-el",
"r": {
"external": "this is an external resource"
}
}
}
These files can be fed directly to the i18next instance to provide the runtime translation resources. There is no need to put these files under source control, as they can be regenerated on every build.
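For completeness, here is a sketch of how the generated locale files could be loaded at startup instead of inlining the resources, following the backend-based setup described in the aurelia-i18n documentation; the paths and options shown are assumptions for this example:
import { TCustomAttribute, Backend } from 'aurelia-i18n';

export function configure(aurelia) {
  aurelia.use
    .standardConfiguration()
    .plugin('aurelia-i18n', (instance) => {
      const aliases = ['t'];
      TCustomAttribute.configureAliases(aliases);

      // load the compiled locale files produced by the gulp tasks above
      instance.i18next.use(Backend.with(aurelia.loader));

      return instance.setup({
        backend: {
          loadPath: './locales/{{lng}}/{{ns}}.json' // e.g. ./locales/de/translation.json
        },
        attributes: aliases,
        lng: 'en',
        fallbackLng: 'en'
      });
    });

  aurelia.start().then(a => a.setRoot());
}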
Putting It All Together
Using the tasks described above, you can compose a gulp task like below and use it in your CI/CD.
gulp.task("i18n", gulp.series(generateI18nKeys, extractI18n, compileI18n));
Summary
We at Net at Work GmbH consider localization to be an integral aspect of the user interface, as it increases inclusion, broadens our client base, and helps improve general usability. Maintaining the localization correctly is a painstaking task, which is why we take pride in having contributed to making that task easier. As shown above, the packages can easily be used to compose a task that semi-automates the localization process.
Thank you for reading. We hope this will be useful for you.