Multiple choice tests: better "wrong" alternatives to be shown while testing?

One thing that has constantly made tests hard for me is the way Memrise determines and presents the alternative answers shown in multiple-choice tests.

The system seems to preferentially find and show other possible (wrong) answers that are similar to the "right" answer.
To find these, Memrise probably uses some fuzzy-matching algorithm (Soundex, DMP, Levenshtein distance, etc.) to order entries by similarity to the correct answer, then takes the first 4-8 words and presents them.
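To make the guess concrete, here is a minimal sketch of what such similarity-ordered distractor selection might look like, assuming plain Levenshtein distance. The actual Memrise implementation is unknown; `pick_distractors` and the sample pool are made up for illustration.

```python
# Hypothetical sketch of similarity-based distractor selection.
# The real Memrise algorithm is not public; this only illustrates
# ordering a candidate pool by Levenshtein (edit) distance.

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

def pick_distractors(answer: str, pool: list[str], n: int = 4) -> list[str]:
    """Return the n pool entries with the smallest edit distance to the answer."""
    candidates = [w for w in pool if w != answer]
    return sorted(candidates, key=lambda w: levenshtein(answer, w))[:n]

pool = ["thy", "thine", "yours", "your", "house", "banana"]
print(pick_distractors("your", pool, n=3))
```

Note that raw edit distance can behave oddly here: for "your", the unrelated word "house" (distance 3) actually ranks closer than "thy" (distance 4), which may be part of why the chosen distractors sometimes feel arbitrary.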

Also, Memrise does not consider the type of a word or phrase - it presents a random mixture of e.g. nouns, adjectives, even whole sentences, etc.
IMHO it would be much better to present items of equal (or at least similar) type, at least where available (most larger courses have a column that specifies the type). IOW, Memrise should allow course creators to instruct it to only pick answers from other database entries with a matching type.
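Such a type-aware picker could be very simple. The sketch below assumes a "type" column like the one described above; the field names and the fallback behaviour are my own invention, not anything Memrise actually does.

```python
# Hypothetical sketch: restrict distractors to entries whose "type"
# column matches the answer's type. Field names are illustrative;
# Memrise's internal schema is not public.
import random

entries = [
    {"word": "your",  "type": "pronoun"},
    {"word": "thy",   "type": "pronoun"},
    {"word": "yours", "type": "pronoun"},
    {"word": "run",   "type": "verb"},
    {"word": "green", "type": "adjective"},
]

def distractors_same_type(answer: dict, entries: list[dict], n: int = 4) -> list[str]:
    """Pick up to n wrong answers sharing the answer's type; fall back
    to other types when the same-type pool is too small."""
    same = [e["word"] for e in entries
            if e["word"] != answer["word"] and e["type"] == answer["type"]]
    if len(same) < n:  # not enough same-type entries: pad with the rest
        same += [e["word"] for e in entries
                 if e["word"] != answer["word"] and e["type"] != answer["type"]]
    return random.sample(same, min(n, len(same)))

print(distractors_same_type(entries[0], entries, n=2))
```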

Both of the above factors result in tests becoming very hard in large courses, where authors need to add extra info to items in order to distinguish them from one another, with items/answers like the following:

"your [not thy, thine, yours, of you]"

(Actually, I would suggest adding a "not this and that" column that would allow for a comma-separated list of such words.)
When each of the words above also has its own entry, along with all the others on the "not"-list, guess what happens: yes, you'll (most probably) be presented with exactly those words in multiple-choice tests (at least on Android, which is what I'm using most of the time).
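The proposed "not this and that" column could feed directly into the distractor picker. This is a sketch of the suggestion only - the `not_list` field and the helper are hypothetical, not an existing Memrise feature.

```python
# Hypothetical sketch of the proposed "not this and that" column:
# a comma-separated exclusion list per entry that the distractor
# picker must respect. Field names are made up for illustration.

entry = {
    "word": "your",
    "not_list": "thy, thine, yours, of you",  # proposed new column
}

def allowed_distractors(entry: dict, pool: list[str]) -> list[str]:
    """Filter out the entry's own word and everything on its not-list."""
    excluded = {w.strip() for w in entry["not_list"].split(",")}
    excluded.add(entry["word"])
    return [w for w in pool if w not in excluded]

pool = ["thy", "thine", "yours", "house", "green"]
print(allowed_distractors(entry, pool))  # → ['house', 'green']
```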

Here’s a course where this is making it next to impossible to work through speed reviews:

(course forum: [Course Forum] 5000 most frequent Dutch words ♫ Audio)

@MemriseSupport, @JBorrego, @frabcus_memrise, @jamesmulholland: would there be any chance that you would consider improving your algorithm/database-queries?


Supporting the request - what I do at the moment is suffer through new words that could have several answers for a while, as part of learning the actual word; later on, when the word is on maintenance rotation, so to say, I add the other possible answers as Alts (instead of modifying the description as you have shown).


I’ve been saying this for a while - what is needed is an option to exclude similar words, like alternatives but with the opposite effect. Unfortunately, many suggestions fall on deaf ears, just as they used to. By the way, official courses are affected by this algorithm as well; they have many synonyms, too.

And yes, in many cases the Memrise algorithm is useless. If you know how to distinguish different parts of speech by their endings, it does not matter that the words have the same beginning.


Good point. I’ve noticed that myself and tried a lot. I still think my approach with the additional info in brackets, like in the Dutch course, is okay-ish - I like that better than having separate columns, since I virtually always use my phone to learn. It’s really an algorithm issue and can’t be remediated by a course creator. Thus, I support bringing this up to the Memrise team. Thx.


Hi all, thanks for all the comments.

Unfortunately we don’t have any plans to change this in the short-term, but I’ve shared this thread with the relevant team so they can read it.

Although I can’t promise if or when this will be worked on, please rest assured that all customer feedback is collected and taken into consideration for future iterations of the web product.

I hope this helps for now.



Hi @ale_c & @MemriseSupport,

I’ve been wondering how alternatives are generated, and particularly when.

If I create a course level by level, does it create alternatives when each level is completed, from those available at the time?

i.e. after level 1 there may be 15 alternatives available, after level 2 there may be 30, etc.?

OR are they all generated only when the course is complete and published?
(this would mean there might be 100 to 200 to choose from.)

If the latter (only when published), then would unlisting the course and saving, then re-publishing it, regenerate the options from the corrected words as alternatives, rather than from the dreaded phantom old, incorrect entries?

If not, that might be a very simple procedure to implement, and it would put an end to all those requests for a reset button and to attempts to eliminate phantom entries, including the use of scripts (which I don’t understand).



When are alternatives generated when I create a reversed level?
Presumably at the start as they are needed for the learning process.


Jumping in on this request to support it and to say I’d love to see a “tag” format of some sort. I do a couple of Kanji-learning courses where there is one kanji to learn and 4-6 examples, so in a chapter there’ll be around 12 entries that are obviously kanji options (with the answer formatted like “お・せつ” - note the “・”) and about 50 options that are obviously examples (with the answer formatted like “おる”). Sometimes I can guess the answer just from the fact that there’s only one kanji/example answer among the options, so simply being able to tag entries as “kanji” or “example”, and have Memrise group all potential answers accordingly, would significantly improve the learning experience.
