Review broken again?

All I can really do to help is make lists, so here’s an updated one. It’s mostly copy/pasted from my earlier posts, but I have confirmed that all of these issues are still present. (I’ll be crossing out entries as they appear to be fixed.)

1: Pressing escape or clicking ‘See answer’ after correctly answering a question in review treats the item as if it were answered incorrectly for the remainder of the review session. This was not the case before, and it makes it extremely inconvenient to access the extra information on that page.

2: The color-coded question mark icons shown for tests all seem to be the same purple color. I have been relying on these quite a bit to easily distinguish between the different testing directions in my courses, and have been making sure that they all match up for each of the courses that I create, so this is pretty disorienting.

3: When you answer with text from a column other than the intended one, you are counted as wrong rather than being prompted to ‘try again’. This compounds with issue #2 to make review sessions very frustrating for me, as I use three different testing directions in my courses.

4: Items are being introduced in learning sessions much more quickly than before. Previously, they were introduced very gradually: it would introduce the first two, test me a couple of times on those, introduce the third, test me on that along with the first two, then move on to the fourth, and so on. Now it’s almost pouring them all out at once. There are still tests between (most of) them, but nowhere near as many as before, and I’m even seeing many introduced back to back throughout the session. This makes it very difficult for me to keep up with the learning session, as I’m being given more items to juggle before I’ve properly grasped the previous ones, and I can safely say that it’s been having a negative impact on my recall of the items later on. (There’s a rough sketch of what I mean just after this list.)

5: Points from speed and accuracy bonuses aren’t being applied to daily goals.

6: When using ‘preview’ in a level, the items seem to show in a random order, rather than from start to finish.
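
To illustrate what I mean in #4, here’s a rough sketch of how the pacing feels from the learner’s side. This is purely my own illustration; the function, the names, and the numbers are all made up and have nothing to do with Memrise’s actual code:

    import random

    def session_plan(items, tests_between_intros):
        """Yield a flat sequence of ('introduce', word) / ('test', word) steps."""
        pool = []
        for item in items:
            yield ("introduce", item)
            pool.append(item)
            # Re-test the growing pool before the next item is introduced.
            for _ in range(tests_between_intros):
                yield ("test", random.choice(pool))

    words = ["inu", "neko", "tori", "sakana"]
    old_feel = list(session_plan(words, tests_between_intros=3))  # gradual: plenty of testing between intros
    new_feel = list(session_plan(words, tests_between_intros=1))  # items pour out nearly back to back
    print(len(old_feel), len(new_feel))  # 16 vs. 8 steps for the same four words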

I removed the point about the change in how incorrect answers are addressed in review sessions, since @daniel.zohar stated that it’s an intended change. I’m honestly really unhappy with that change, but I guess that’s a discussion for another day. If anything else in this list is an intended change, it would help to know. Prior to @daniel.zohar’s recent post, there were many users pointing out the change to reviewing incorrect answers, thinking it was a glitch, but we were basically left in the dark about it being intentional.

I’ve probably missed some issues, so if anyone has anything that I should add, I can edit this post or make another if it gets buried.

2 Likes

Hi @daniel.zohar, are you saying that the algorithm has been changed so that there are fewer tests for a particular word after answering incorrectly? Are you working to change the algorithm to be more like the app’s? Staying with Memrise, for me, revolves heavily around the website algorithm not being the same as the app’s, so I’d really appreciate knowing this ahead of time vs. a month down the road when a large block of words seems not to have “stuck”.
Also, if it means anything, I’d heavily suggest asking power and heavy users to test new algorithms before implementing changes. It’s definitely the kind of thing that should be discussed, as a lot of people rely on Memrise.

4 Likes

Might I be petty and ask why all of these changes were made when I saw no complaints?

@TinyCaterpillar, looks like the courses now show on the bar when you’re doing sessions. Also, I’m taken back to the level rather than the course main page, so at least that’s been handled.

1 Like

Good to know. Thanks!

I think this is a terrible downgrade of Memrise’s functionality. For several years, whenever I forgot a word, Memrise had me hold the word in my memory for a minute or so and then re-type it. This was very effective. Now, not so much. I thought this was a temporary bug. Now that I hear it is intentional, I wonder what the company is thinking. How does this change support learning?

5 Likes

This bothers me also. He’s mentioned an algorithm that gets more difficult the better you do, and he also mentioned what you quoted. I tried the app for a while when it first came out and noticed a big drop in my retention, so I’ve always stuck with the website because I felt it kept the “classic” algorithm. I understand them wanting to improve it, but suddenly changing the algorithm is a pretty big deal, and it would have been great if it had been discussed and/or worked through with the community.

1 Like

I’m learning French and have completed several courses. I use a PC and an android phone for Memrise.

On the PC during reviewing, if I enter the correct answer it repeatedly marks the answer as incorrect, even if I subsequently copy and paste their answer into the dialogue box.

As far as I can make out, when typing the answer into the difficult words dialogue box, the answer highlights as wrong one character after I have typed an apostrophe.

The problem only occurs on the courses written by Memrise. Also, occasionally I’m getting the Spanish character set or no character set at all, which makes typing in French difficult.

I was not even using “escape” for that; I was pressing “see answer” after the “next” button turned green. Very annoying, just as annoying as the fact that the mistaken word is now only shown once.

I remember years ago there were somewhat functional dictionaries for some of the languages; I always wondered why course or language dictionaries weren’t integrated more with their mems. After swallowing that, having mems accessible only during review was already kind of a big deal. If the current state, where they’re only accessible through a mistake, remains, then I feel like mems are being put on the back burner even more. For me personally, for words that are particularly difficult, reviewing mems is as important as reviewing the words themselves.

@Hydroptere Agreed, very annoying… :disappointed: I’ll update my post to point out both methods. Thanks!

@Wuxian I’ve also been getting the feeling that they’re still trying to phase mems out since the attempt at basically getting rid of them altogether. Losing the ability to import mems from the wiki was another big step in the wrong direction on that front. I really hope this one’s just a glitch… Prior to this issue, I checked my mems during review sessions all the time for things like kanji, to better cement the radicals in my head, along with various other information on the page not shown during tests. Now I’m simply unable to do so without making the reviews a massive chore.

Audio review does not seem to function; it freezes after one item.

Hi @Hydroptere, is this issue still happening? If so, can you send me a screenshot/course name/item so we can have a look into it?

Thank you, TinyCaterpillar, for the list. Numbers two and three on your list are big ones for me. Switching between phonetic, native language, and character sets gets frustrating without these two features.

2: The color-coded question mark icons shown for tests all seem to be the same purple color. I have been relying on these quite a bit to easily distinguish between the different testing directions in my courses, and have been making sure that they all match up for each of the courses that I create, so this is pretty disorienting.

3: When you answer with text from a column other than the intended one, you are counted as wrong rather than being prompted to ‘try again’. This compounds with issue #2 to make review sessions very frustrating for me, as I use three different testing directions in my courses.

1 Like

In two Indonesian courses I could do it without problems; in these I could not even start it:

thanks

A new round of fixes has just been released:

  1. Answers from other columns are no longer marked as incorrect
  2. When previewing items, they are no longer shown in a random order
  3. Fixed problem when scrolling through images on image courses
  4. Speed and accuracy bonuses now count towards the daily goal

Thank you all for your help and patience!

1 Like

Audio review is still broken.

Thanks Geil, we’re looking into it. Are you experiencing this issue on all courses?

@BeaTrisy, I’m still getting synonyms marked wrong, despite “Answers from other columns are no longer marked as incorrect”.

Also, could you please actually comment on a couple of posts here so we’re not left in the dark? I truly do not understand how you take money from users but ignore the lion’s share of their comments.

Like from @TinyCaterpillar’s post:

"1: Pressing escape or clicking ‘See answer’ after correctly answering a question in review treats the item as if it was answered incorrectly for the remainder of the review session. This was not the case before, and it makes it extremely inconvenient to access the extra information in that page.

2: The color-coded question mark icons shown for tests all seem to be the same purple color. I have been relying on these quite a bit to easily distinguish between the different testing directions in my courses, and have been making sure that they all match up for each of the courses that I create, so this is pretty disorienting.

4: Items are being introduced in learning sessions much more quickly than before. Previously, they were introduced very gradually: it would introduce the first two, test me a couple of times on those, introduce the third, test me on that along with the first two, then move on to the fourth, and so on. Now it’s almost pouring them all out at once. There are still tests between (most of) them, but nowhere near as many as before, and I’m even seeing many introduced back to back throughout the session. This makes it very difficult for me to keep up with the learning session, as I’m being given more items to juggle before I’ve properly grasped the previous ones, and I can safely say that it’s been having a negative impact on my recall of the items later on."

Also, is what I and others have described a glitch or a change? “Say I get the word “dog” wrong. Before, if you continued to get it wrong, the review session would end. Now, you have to type “dog” correctly ONCE for the session to end. That means if you’re stuck on a twenty-word sentence, the session will never end unless you type the sentence perfectly. This was a change from 25 days ago when we started this thread.”
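
To spell out the contrast being described, here’s a toy sketch. It is entirely my own illustration; none of it reflects Memrise’s actual code, and the return strings are just labels for the two behaviours:

    def old_review(target, attempts, max_attempts=3):
        # Old behaviour (as we remember it): after a few failed attempts the session ends anyway.
        for i, answer in enumerate(attempts, start=1):
            if answer == target:
                return "correct"
            if i >= max_attempts:
                return "session ends anyway"
        return "session ends anyway"

    def new_review(target, attempts):
        # New behaviour: the session will not end until the item is typed perfectly once.
        for answer in attempts:
            if answer == target:
                return "correct"
        return "still stuck on this item"

    print(old_review("dog", ["dig", "dug", "dag"]))  # session ends anyway
    print(new_review("dog", ["dig", "dug", "dag"]))  # still stuck on this item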

And what about what @Wuxian asked,

“Hi @daniel.zohar, are you saying that the algorithm has been changed so that there are fewer tests for a particular word after answering incorrectly? Are you working to change the algorithm to be more like the app’s? Staying with Memrise, for me, revolves heavily around the website algorithm not being the same as the app’s, so I’d really appreciate knowing this ahead of time vs. a month down the road when a large block of words seems not to have “stuck”.
Also, if it means anything, I’d heavily suggest asking power and heavy users to test new algorithms before implementing changes. It’s definitely the kind of thing that should be discussed, as a lot of people rely on Memrise.”

Just comment! How can you not see the frustration of your members when we’ve been having issues since the update on August 17, and the staff has barely responded to us?

Could you maybe have ONE DAY, JUST ONE DAY of the week devoted to answering user questions? Even just a, “We’re not considering that change at this time,” would be more appreciated than showing your indifference by ignoring us until your site completely blows up from an update two days ago.

2 Likes

Well, 2. and 3. are definitely not fixed for me. 4. is indeed fixed now.

Thanks for the update, @BeaTrisy, but I just tested, and I’m afraid #1, #2, and #4 are still present on my end. I tried answering a kana test with the kanji (and vice versa) and was marked wrong, previewing a level still gave me a random order, and after finishing a review session that earned both speed and accuracy points, only the base points were added to the goal. I was never clear on the issue with #3, so I can’t comment on that. Please feel free to ask if there’s any additional info that would be helpful.