MTPE, or machine translation post-editing, is a current reality and a growing trend in the localization industry. It combines the speed of machine translation engines with human understanding of context to deliver quality results in the shortest time. The scheme is simple: translators don't work "from scratch" - they receive content already translated by MT (machine translation), which they proofread and edit. This raises a logical question: how can we measure the translator's effort during post-editing? How do we know whether a text segment was edited or was already good from the start? Fortunately, we know the answer.

Comparing translator effort with and without post-editing

To demonstrate the problem more clearly, let's take three short text segments and imagine that they need to be translated into German:

"I love cats."

"If you have any further questions,..."

"flat design"


When working with a freelance translator, you will most likely pay a per-word rate, for example, 0.08$. The three segments above contain 10 words, so the final amount will be 0.08$ x 10 = 0.80$.

In the case of post-editing, the translator will have the following data from the start:

| English | Machine translation (German) |
|---|---|
| I love cats. | Ich liebe Katzen. |
| If you have any further questions,... | Wenn Sie weitere Fragen haben,... |
| flat design | flaches Design |

So, the translator's task is to make edits based on the context and tone of voice:

| English | Machine translation | Post-edit |
|---|---|---|
| I love cats. | Ich liebe Katzen. | None, the text was correct from the start. |
| If you have any further questions,... | Wenn Sie weitere Fragen haben,... | Wenn du weitere Fragen hast,... (if an informal tone of voice is needed) |
| flat design | flaches Design | Wohnungsgestaltung (if the word "flat" was used in the "apartment" meaning) |

Thus, the translator left the first segment as is, made minor edits to the second, and changed the last one entirely. There was no need to translate every word, and the task was completed faster and with less effort. How can this be calculated?

Why and how to measure MTPE effort?

Measuring translators' effort allows clients to get fair and transparent pricing for content translation. If the machine engine translated the text perfectly, the translator only needs to proofread and "approve" the ready text, which can lower the price. Let's look at the approaches for doing this:

HTER (Human-targeted Translation Edit Rate) - the edit distance between the original machine translation and the post-edited version:

  1. Original translation - the machine-translated text.
  2. Post-edited translation - the text after human post-editing.
  3. Calculate the edit distance between the original and post-edited translations using an algorithm like Levenshtein distance.
  4. Interpretation: a higher edit distance implies more effort from the translator.
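As a sketch of steps 3 and 4, a word-level Levenshtein distance normalized by the post-edited length gives an HTER-style score. Note this is a simplification: real HTER also counts block shifts, which plain Levenshtein ignores.

```python
def levenshtein(a: list[str], b: list[str]) -> int:
    """Minimum number of word insertions, deletions, and substitutions."""
    prev = list(range(len(b) + 1))
    for i, wa in enumerate(a, start=1):
        curr = [i]
        for j, wb in enumerate(b, start=1):
            cost = 0 if wa == wb else 1
            curr.append(min(prev[j] + 1,          # delete a word from a
                            curr[j - 1] + 1,      # insert a word from b
                            prev[j - 1] + cost))  # substitute (or match)
        prev = curr
    return prev[-1]

def hter(mt_output: str, post_edited: str) -> float:
    """Word-level edit distance normalized by post-edited length."""
    mt, pe = mt_output.split(), post_edited.split()
    return levenshtein(mt, pe) / max(len(pe), 1)

# Second example segment: two word pairs changed for informal tone.
print(hter("Wenn Sie weitere Fragen haben,...",
           "Wenn du weitere Fragen hast,..."))  # 0.4
```

A higher value means more of the machine output had to be rewritten; 0 means the translator changed nothing.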

SPW - seconds per word. Shows the time spent on post-editing divided by the number of words in the post-edited translation: SPW = Total Time (in seconds) / Total Number of Words.
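In code, SPW is a one-line division; the numbers below are hypothetical, just to show the unit:

```python
def seconds_per_word(total_seconds: float, total_words: int) -> float:
    """SPW = total time (seconds) / number of words in the post-edited text."""
    return total_seconds / max(total_words, 1)

# e.g. 90 seconds spent post-editing a 30-word passage
print(seconds_per_word(90, 30))  # 3.0
```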

KEYS - the total number of keystrokes pressed while post-editing.

Editing density: (number of edited words / total number of words) x 100.

Quality improvement: using BLEU or TER metrics to evaluate the quality of the machine translation and the post-edited text.

Measuring MTPE effort is language-agnostic and can be applied to different language pairs. For example, we can confidently say that MTPE of the same text between English and Spanish is much faster than between English and Ukrainian or German. The reason is that popular engines have the highest level of accuracy for the English-Spanish pair (Google Translate has over 90% accuracy). If this topic interests you and you want to learn more, read our article about the most popular machine translation engines.

How does Lingohub help to measure MTPE effort?

Lingohub provides a prefill option, which means you can fill empty text segments from other languages, translation memory, or machine translation. This led us to think about how to measure the translators' effort - the result is EES.

EES, or Edit Effort Score, is a measure of the amount of editing in a translation. It ranges from 0 to 100, where 0 means no edits - the text was approved with no effort (proofreading) - and 100 means the text was changed entirely. For example, if the sentence "Please create a note" was changed to "Please create a file," the EES is calculated as 14, classifying it as a low-effort case. To make this 0-100 range convenient to work with, we split it into four levels:

  • proofreading - effort score = 0
  • low effort - from 1 up to a configurable threshold
  • middle effort - fully configurable
  • high effort - from a configurable threshold up to 100

Thus, you can configure the score boundaries for every level except proofreading.
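To illustrate how such a score and its configurable levels might fit together, here is a rough sketch. The normalization (difflib's similarity ratio scaled to 0-100) and the default thresholds are assumptions for illustration only; Lingohub's actual EES formula may differ.

```python
from difflib import SequenceMatcher

def ees(mt_text: str, edited_text: str) -> int:
    """0 = no edits (proofreading), 100 = entirely rewritten.
    Approximated here with a normalized similarity ratio; the real
    EES formula is Lingohub-internal and may differ."""
    ratio = SequenceMatcher(None, mt_text, edited_text).ratio()
    return round((1 - ratio) * 100)

def effort_level(score: int, low_max: int = 30, mid_max: int = 70) -> str:
    """Map a 0-100 score to the four levels; low_max and mid_max stand in
    for the configurable boundaries described above."""
    if score == 0:
        return "proofreading"
    if score <= low_max:
        return "low effort"
    if score <= mid_max:
        return "middle effort"
    return "high effort"

print(ees("Please create a note", "Please create a note"))  # 0
print(effort_level(ees("Please create a note",
                       "Please create a file")))  # a low-effort edit
```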

EES settings

This EES information is used in cost reports, so the localization manager and the translator can see how many words were translated, and with what effort, over a given period.

translators efforts

Moreover, you can set a specific price per word for each effort level - lowest for proofreading and increasing with the effort level.

manage rates

Let's compare the price difference between a constant price per word and effort-score-based pricing. As an example, we will use the text segments from the previous paragraph and the rates below (with the same 0.08$ as the price for a full translation, i.e., high effort):

  • Proofreading = 0.01$
  • Low effort = 0.02 $
  • Middle effort = 0.03$
  • High effort = 0.08$
| English | German | Edits | Effort level | Price |
|---|---|---|---|---|
| I love cats. | Ich liebe Katzen. | None, the text was correct from the start | Proofreading | 0.01$ x 3 words = 0.03$ |
| If you have any further questions,... | Wenn Sie weitere Fragen haben,... → Wenn du weitere Fragen hast,... | Informal tone of voice | Low effort | 0.02$ x 5 words = 0.1$ |
| flat design | flaches Design → Wohnungsgestaltung | "flat" used in the "apartment" meaning | High effort | 0.08$ x 2 words = 0.16$ |

So the final price for the 3 segments (10 words) will be 0.03$ + 0.1$ + 0.16$ = 0.29$, which is significantly more cost-effective than the previous example's flat rate of 0.08$ per word for 10 words, which totaled 0.80$.
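The comparison above can be recomputed in a few lines. The word counts and effort levels follow the example (note that the high-effort segment, "flat design", has 2 words at the full 0.08$/word rate):

```python
# Per-word rates by effort level, as in the example above.
RATES = {"proofreading": 0.01, "low": 0.02, "middle": 0.03, "high": 0.08}

# (segment, word count, effort level) from the worked example.
segments = [
    ("I love cats.", 3, "proofreading"),
    ("If you have any further questions,...", 5, "low"),
    ("flat design", 2, "high"),
]

tiered = sum(words * RATES[level] for _, words, level in segments)
flat = sum(words for _, words, _ in segments) * 0.08  # constant per-word rate

print(f"tiered: {tiered:.2f}$, flat: {flat:.2f}$")  # tiered: 0.29$, flat: 0.80$
```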

If you want more information about how EES and cost reports work, we highly recommend reading our help center articles:


Machine translation engines have entirely changed the "rules of the game," so the translation industry should embrace the new reality and find a way to manage it successfully. The Lingohub translation management system provides a simple yet effective solution that allows users to work with translators' effort transparently and fairly.

If you have any questions or want more information, schedule a quick demo call with our team, or sign up and try all the Lingohub features free for 14 days!

Try Lingohub 14 days for free. No credit card. No catch. Cancel anytime.