Memrise Release Notes - 28 Nov 2017 (Update on changes to typing test)

Thank you for the clear explanation. And thanks so much to @stream_nine90 for writing the script.


Is there any solution to remove the keyboard with a script (one that can be turned off/on)? For the courses asking for strict typing, I cannot disable uBlock every time I need the special characters that the course creator provided. (By the way, to recap: in user-provided courses, if the creator has added the special characters for the respective language, then one does not get bothered with Memrise’s “anagrams”.)

The silence of the team is deafening.


Thank you so much for writing this.

(Posted this in the wrong memrise announcement, so repeating here.)

It seems there is a way for course creators to override the awful anagram/jumbled letters issue that is plaguing non-language courses (and language courses without special alphabets).

Under ‘edit course’, ‘edit column’, ‘testing’, ‘keyboard characters’, entering any characters will override the anagram-of-the-answer feature: for instance ‘a’, ‘thewholealphabet’, or just a full stop. Users can still, of course, type whatever they want and are not restricted to the characters entered in this box.

I have now done this for my created courses; I hope course creators do the same for the ones I’m a consumer of.

A further workaround I have used for the other problem, auto-acceptance, is to change my target words to begin with capital letters. Since most people won’t naturally type the capital letter, a correct answer doesn’t trigger the auto-accept. It should be said that ‘Mark typing strictly’ needs to be turned off for this to work. Not appropriate for every course, I know, and a faff to implement.

This has saved memrise for me - at least for my own courses.


Thank you so much for this!!!

I had spent ages creating a course, only to realise afterwards that ‘auto accept’ left it close to null and void.

I’ve spent nearly half a day trying to find a solution!!


That is such a beautifully simple way to solve the problem of ‘auto accept’!!

Aaaaa!!! :slight_smile:

Why not just make ‘auto accept’ an option?

Much of the time, my course tests the user’s knowledge of adjective and verb forms for singular and plural; masculine and feminine; and first, second and third person. The different forms are usually very similar words.

My course is a Romanian language course for English speakers.

The singular masculine form for ‘good’ in Romanian is ‘bun’, the plural feminine form for ‘good’ is ‘bune’.

If the learner is asked for the singular masculine form of ‘good’ in Romanian, and they think it is ‘bune’ when actually it is ‘bun’, they can still get it ‘incorrectly right’ (if you see what I mean). As soon as the learner types the ‘n’ in ‘bune’, the system accepts ‘bun’ as correct, even though ‘bune’ was going to be written.
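The ‘bun’/‘bune’ failure can be sketched as a toy model (this is not Memrise’s actual code; `auto_accept` and `simulate_typing` are hypothetical names, and the real check may be more elaborate):

```python
def auto_accept(typed: str, answer: str) -> bool:
    # Toy model of auto-accept: the answer is marked correct the
    # moment the typed text matches it exactly.
    return typed == answer

def simulate_typing(intended: str, answer: str) -> bool:
    # Feed the learner's intended word one keystroke at a time,
    # checking after each keystroke as the site appears to do.
    typed = ""
    for ch in intended:
        typed += ch
        if auto_accept(typed, answer):
            # Accepted mid-word, before the learner finished typing.
            return True
    return typed == answer

# The learner intends to type 'bune' (wrong) when the answer is 'bun':
# after the keystrokes b, u, n the field reads 'bun' and is accepted,
# even though the final 'e' was still coming.
```

The same sketch also shows why the capital-letter and trailing-space workarounds in this thread help: ‘Bun’ or ‘bun ’ never appears as an intermediate state of what the learner types naturally, so the premature match never fires.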

I believe the site works on science/psychology/neuroscience (whatever) based algorithms, so the ‘auto accept’ feature, considering what I wrote above, seems to leave the science and algorithms behind the site null and void under these conditions.


How about this: add a space at the end of words? It’s also less intrusive than the capital-letters scheme.

But I agree of course - I wish they would add an option. In the meantime…


I would be fine with auto-accept if it was consistent.

Some answers auto-accept, some don’t. When they don’t, I wind up second-guessing myself, perhaps changing my answer, hitting ‘enter’, and then seeing that my first answer was right and it just didn’t auto-accept for some random reason.

My rate of wrong answers massively increases with this feature.


In my experience, it’s probably a case of just omitting a punctuation mark or diacritic. Annoying, isn’t it?

Sometimes it’s not including a full stop or question mark, sometimes it’s having or not having a capital letter at the start, but sometimes it just doesn’t work at all.


Yep! That’ll do it.

I haven’t noticed that…yet!

That’s interesting, because mine has improved. If my answer doesn’t get auto-accepted, I usually have time to correct it, if it’s something simple like a missed punctuation mark, accent, initial capital, etc. That’s not always possible for long phrases, of course, and it’s probably not the case if you are learning a language that, for example, doesn’t use the Roman alphabet.


Nice update! Thanks

“Guessing” does not help learning, Alan. Maybe your mistake rate is lower when Memrising, but in real life…

Yes, agreed. Although it wasn’t my intention to claim that I’m learning better as a result of ‘auto-accept’ or that I “guess” (!) my answers. :face_with_raised_eyebrow: It’s true, of course, that unless your first answer is correct in every respect, you haven’t learned that item fully… unless, perhaps, you happen to be a hopeless typist. :smile:

I am not a fan of ‘auto-accept’.


Actually, making a mistake, struggling a bit to figure out the mistake, and then correcting it absolutely does support learning. The error-rethink-fix sequence is a fundamental human learning activity.

But, sure, it isn’t perfect. It does mess up the spaced recall, because an item you didn’t know should, in theory, come up again soon, and now it won’t. And it scores as correct any item that you thought had more letters on the end. And if you think of the review as a “summative test” to be scored, rather than a “learning activity” to further you along your educational path, then of course this change is frustrating. But still…

Somehow, it feels like if I post anything that isn’t angry and bitter about this, I’m going to be slammed in the threads and personal messages. But in the languages I am learning, the courses I am learning, the recent changes have been an overall positive improvement.


That’s the big, big, big problem. Some weeks ago, I didn’t have much time to work on my courses, and went through them quickly on the app (with auto-accept: I didn’t know back then that it was possible to turn it off in the app). When I get these words again now, many weeks later (since they were “known”, the repetition interval was very long), I see I haven’t mastered them at all. I should have repeated them much more, but Memrise “thought” I knew them and didn’t need to repeat them. So I did not learn these words correctly back then, and now I have to spend a lot of time learning them again: this is all wrong.
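The scheduling damage described here can be illustrated with a toy spaced-repetition rule (a sketch only; Memrise’s real intervals and multipliers are not public, so the numbers below are assumptions):

```python
def next_interval(days: float, correct: bool) -> float:
    # Toy schedule: a correct answer multiplies the interval,
    # a wrong answer resets it to one day. The 2.5 multiplier
    # is an assumption, not Memrise's actual value.
    return days * 2.5 if correct else 1.0

# An item the learner does NOT actually know, but which auto-accept
# keeps marking as correct, gets pushed further and further out:
interval = 1.0
for _ in range(4):
    interval = next_interval(interval, correct=True)
# After four falsely "correct" reviews the item is roughly 39 days
# away, instead of being reset to tomorrow by an honest wrong answer.
```

Whatever the real multiplier, the shape of the problem is the same: every false “correct” compounds, so the words you know least are the ones you see least often.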


I will make that move as well, even though my subscription is not finished.

Memrise seems to me like a good concept… very badly developed, probably with poor programming, making every change too complicated.

That would explain their silence, and it would explain the lack of conviviality and flexibility.

“Auto correct” is fixed - it now works for official Memrise 1-7 courses:

Nowadays these force “strict typing” when you use Cooljingle’s “all typing” user script, even though they are configured as “non-strict typing” for single words (and were working well before the last update).

I hadn’t used Memrise for a few months, but I couldn’t believe the downgrade when I loaded it up this morning, and having found this thread I think it’s baffling that this has been going on for two months. The system remains awful for learning Japanese with an IME and the web app. The keyboard reduces its effectiveness as a meaningful learning tool; auto-accept is a needless crutch that further reduces the need for certainty and precision; the progression from one item to the next takes eons; and your previously entered text remains in the next item’s entry field, so you have to delete the previous answer for EVERY SINGLE ITEM.

Much as this is a new problem for me, I can’t believe that you would leave users waiting for close to two months with no meaningful fix. Good on the community for releasing some workarounds, but I’m extremely disappointed in the dev team and very tempted to move over to a different application.