Ambiguous questions in speed reviews

In the French course I am currently going through, there are a number of word pairs/triplets where the English translations are literally the same, e.g. ‘line’ (la ligne, le trait), ‘success’ (le succès, la réussite), ‘real’ (réel, véritable) and so on. Not much of a problem in a Classic Review - you just have to try them all until one matches.
Now, the Speed Review algorithm seems to be specifically designed to make things harder for you by making you choose the answer from among similar-looking or otherwise related words, which is great. The downside, however, is that more often than not, when asking you to translate e.g. ‘line’, it presents you with BOTH ‘la ligne’ AND ‘le trait’ – both are correct answers, but only one of them is accepted as such by the system. So I was wondering whether it would be possible to adjust the algorithm so that this doesn’t happen?

Which course? A link would be very helpful.

You might try searching for ‘Course forum’ + the course title, if you haven’t done so already – perhaps there is already a thread specifically for that course.

This course: https://www.memrise.com/course/1189/intermediate-french/

But the way I see it, this problem isn’t really course-specific. Who knows how many ambiguities like that are out there - it might be very difficult to fix them all.

Speed Review, on the other hand, could be fixed in at least two ways (a rough sketch of both follows after this list): if a word has two possible translations, then

  1. either make sure they don’t both appear as choices within the same question
  2. or accept either one as correct
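
To make that concrete, here is a minimal Python sketch of how a multiple-choice generator could apply both fixes. The course model (the COURSE dictionary) and the function names are made up purely for illustration – this is not Memrise’s actual code or data model:

    import random

    # Hypothetical course data: prompt -> all accepted translations.
    # (Illustrative only; this is not Memrise's real data model.)
    COURSE = {
        "line":    ["la ligne", "le trait"],
        "success": ["le succès", "la réussite"],
        "real":    ["réel", "véritable"],
        "boat":    ["le bateau"],
    }

    def build_choices(prompt, n_choices=4):
        """Fix 1: pick one accepted answer plus distractors that are NOT
        also accepted answers for the same prompt."""
        accepted = set(COURSE[prompt])
        correct = random.choice(COURSE[prompt])
        pool = [w for p, words in COURSE.items() if p != prompt
                for w in words if w not in accepted]
        distractors = random.sample(pool, min(n_choices - 1, len(pool)))
        choices = distractors + [correct]
        random.shuffle(choices)
        return correct, choices

    def is_correct(prompt, answer):
        """Fix 2: accept any of the prompt's translations as correct."""
        return answer in COURSE[prompt]

    correct, choices = build_choices("line")
    assert sum(c in COURSE["line"] for c in choices) == 1  # never both 'la ligne' and 'le trait'
    assert is_correct("line", "la ligne") and is_correct("line", "le trait")

Fix 1 keeps ‘la ligne’ and ‘le trait’ from ever appearing in the same question; fix 2 means that tapping either of them would still count as correct.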

@NikolaiYourin

Hi Nikolai,

This makes me wonder why you are interested in Speed Reviews.

Do you use the web portal or the Android app?

The Speed Review on the web portal really behaves quite oddly (including the increasing red bar approach vs. a reset), and I do not really like the way it was programmed.

If you just want to rely 100% on multiple choice and be able to clear your backlog queue:

Try this Tampermonkey user script from Cooljingle: https://github.com/cooljingle/memrise-all-multiple-choice

You are right: there are a lot out there. I too would like Memrise to recognize ambiguity and remove it.

However, the way Memrise works at the moment, it is the responsibility of the course creator to remove any ambiguities (and that is why I asked for the link). Sometimes that is a good thing (when it forces the meaning/definition to be made clearer), sometimes not (when the words are near-100% synonyms).

If a creator enters this:
English - Description
boat - a vessel to go on water
ship - a vessel to go on water
then, as of now, Memrise doesn’t detect that the definitions are the same and might offer both as choices in the same question. I agree with your first point: Memrise should detect this and present only one of them. Why don’t you make a feature request (if there isn’t one already)?
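
For what it’s worth, a simple duplicate-definition check would be enough to flag (or merge) such entries before distractors are picked. The Python sketch below runs over a made-up flattened course export – it is not anything Memrise actually exposes:

    from collections import defaultdict

    # Hypothetical flattened course export as (word, definition) pairs.
    entries = [
        ("boat", "a vessel to go on water"),
        ("ship", "a vessel to go on water"),
        ("car",  "a vehicle to go on roads"),
    ]

    def find_ambiguous(entries):
        """Group words that share an identical definition, so the test
        generator (or the course creator) can treat them as synonyms."""
        by_definition = defaultdict(list)
        for word, definition in entries:
            by_definition[definition.strip().lower()].append(word)
        return {d: words for d, words in by_definition.items() if len(words) > 1}

    print(find_ambiguous(entries))  # {'a vessel to go on water': ['boat', 'ship']}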

There are two solutions as of now:

  1. adding boat as an alternative to ship and vice versa (which you cannot do with bulk upload)
  2. adding an extra word to the description (which can be bulk uploaded), e.g.
    boat - a vessel to go on water (small)
    ship - a vessel to go on water (large)
    I like this option the most, especially when there is a noticeable difference in meaning (a small preprocessing sketch follows below).
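
If you prepare your course in a spreadsheet, solution 2 can be automated before bulk upload. The sketch below assumes a tab-separated word/definition file; the file names and the HINTS table are made-up examples, not a Memrise feature:

    import csv

    # Hypothetical disambiguators chosen by the course creator.
    HINTS = {
        "boat": "small",
        "ship": "large",
    }

    def add_hints(in_path, out_path):
        """Append '(hint)' to the definition of each hinted word so that
        otherwise identical definitions become distinguishable (solution 2)."""
        with open(in_path, newline="", encoding="utf-8") as src, \
             open(out_path, "w", newline="", encoding="utf-8") as dst:
            reader = csv.reader(src, delimiter="\t")
            writer = csv.writer(dst, delimiter="\t")
            for word, definition in reader:
                hint = HINTS.get(word)
                writer.writerow([word, f"{definition} ({hint})" if hint else definition])

    # add_hints("course.tsv", "course_with_hints.tsv")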

@Thomas.Heiss

Hi Thomas,
Speed Reviews are not perfect, but they are one of only two types of exercise available to non-Pro users, so why not make them better? They’re useful when you’ve just started a course, learned a lot of words at once and want to put yourself through some extra drill to remember them better.
They are, however, a bit too easy, in the same way most multiple-choice tests are.
As for your other question, I’m using the web portal (I wouldn’t even want to think about entering all those accented letters without a proper keyboard).

By the way, this issue does not only affect speed reviews. This morning, I got:

maintenir = to maintain
entretenir = to maintain

one right after the other, within the same learning session, and guess what happened next… right, not just one but several multiple-choice questions containing both of them.

Hi John,
Yes, that looks tidy and makes sense, but some (usually very simple) words are impossible to disambiguate like that. I’ve seen some other attempts at disambiguation, like:

proche = close (not ‘près’)
près = close (not ‘proche’)

but those were quite ugly, and imagine doing that with a word that has four translations. Also, altering existing courses that have thousands of subscribers might not be a good idea.

So fixing the site’s back-end might be the only realistic way around this issue. The algorithm behind those multiple-choice questions is already quite sophisticated, so implementing either of those simple fixes on top of it should be a no-brainer. I’m not sure about making a feature request, though, because to me this behaviour looks more like a bug than anything else.

(sorry for such a long post)

Usually I do not use “full typing mode” for the first six learning (flower-planting) steps, so with multiple-choice or listening exercise types I have the same problem.

For reviewing mode:

We usually have the same problem here, as words in a course often do not have alternative definitions.
We are not on Duolingo here, where a word or sentence often has MULTIPLE accepted alternatives in the database.

Unless the course creator provides ALL alternatives/synonyms, the multiple-choice algorithm has no basis for selecting unique words.

So this is what could be improved: either in the algorithm (if it does not do this already), or by the course owner/contributor adding appropriate synonyms and cross-links where they are missing.

My (review) workaround:

  • I do not turn off the standard “Auto accept” (which is, by the way, totally broken for multiple alternative definitions; you need to hit the ENTER hotkey manually!)

  • I avoid multiple-choice exercises (with the exception of the learning phase) by using Cooljingle’s “all typing” Tampermonkey user script on the web portal

  • when MartinPen’s user-created PT BR course asks for one of 2-3 similar words (with missing hints) within a course or a 100-word review session with missing synonyms, I can somehow brute-force during REVIEWS – with “Auto accept” and “Auto correct” turned ON – which of the 2-3 is the word the current question expects in the empty text field.
    If I type the correct one, or change my answer to it, it is usually accepted on the 2nd-3rd try, the text field turns green and Memrise jumps to the next question.

  • “all typing” works a bit better with added synonyms than without, within the limits of the broken Memrise “Auto accept” code

  • I usually do not use the “Memrise Turbo” script as a replacement for Memrise’s own default “Auto accept”, even though it may handle alternatives/synonyms better (which I would first have to check/test), because this script is not in sync with the audio:
    It simply does not make sense to use it for the official Memrise PT BR 1-7 + PT Basic courses with their longer phrases/sentences.
    IMHO you can really only use the Turbo script with courses that contain single words, where I do not care if the audio only plays on the next page(s) because I am already 2-3 questions ahead with my typing.

  • I would love to see the default “Auto accept” bug fixed by Memrise staff, or re-engineered and improved by a user-script god, so that alternatives/synonyms are properly (auto-)accepted and there is a really working auto-jump to the next question when my answer is in fact right, as I do not want to speed-cycle through (difficult) questions.
    I want the question/answer to stay in sync with the audio playback when my answer is accepted (I do not mind the waiting time).