Review broken again?

Yeah, I got the “See answer”/Escape issue to finally work by turning off those scripts. Glad that’s been resolved…

Hello all,

Thank you so much for your feedback over the last few weeks. This will hopefully be the last update, so let’s get to it:

  • Pressing Escape/“See answer” during a test should no longer count the answer as wrong.
  • We’ve tweaked the learning algorithm to introduce words more gradually in sessions.

Thank you again for your patience and Happy Learning!

I think this explains why planting sessions don’t work anymore :frowning: The learning/planting session order is broken! @BeaTrisy

Hi @11145, when did you last try a learning session?

@BeaTrisy About half an hour ago.

@BeaTrisy, there is a bug that I just tested for and that still seems to be present.

The bug: currently, if you type a correct answer that belongs to a column other than the one being asked for, a little message pops up saying that you typed a correct answer, but for a different column than was asked for. However, if you type a correct alternate answer for another column than the one asked for, it’s simply counted as wrong.

For example:

Prompt: “个”
Correct answer: “a”
Pinyin: “ge4”, with “ge5” as an alternate answer.

If one types “ge4”, one gets a nice little message saying that they answered the wrong column. If one types “ge5”, it’s simply marked as incorrect.

@BeaTrisy For quite a while now I have been experiencing trouble with reviewing from the dashboard. After about 20 phrases reviewed, I get a message saying “Ups! I could not load the session”, and from there on I can only review items directly in the courses, which is very tedious with 20+ courses learned in a single language. Logging in and out does not help, and neither does changing browsers. The next day I can again review about 20 items from the dashboard, and then the bug sets in again. I already submitted the bug via a form that I was told sends it directly to your developers and QA team, but I never got a reaction.

@BeaTrisy I’m still not seeing different colors for the question-mark graphic in review. Again, this is really helpful for review sessions in Chinese. As a reminder: it used to be that when the response should be in English, the question mark would be green; answers in Hanzi were purple; answers in Pinyin, yellow; questions from a second course, blue; etc. I had gotten used to this and now find myself making a lot of stupid mistakes. It’s a little frustrating. Are you guys still planning on putting this back in?

Thanks for all your work!

@BeaTrisy, turning off audio in reviewing, and in learning, does not function anymore.

Many thanks for your assistance.

Hi, @BeaTrisy, @daniel.zohar. Could we please hear back on the color-coded question mark icon issue? It’s still displaying purple on all items for me, which takes away a very helpful way of quickly determining which column I’m being tested on.

I was planning to begin building my next course today, but I can’t even start with this issue floating around. I’ve always been careful to ensure that all of my courses use the same color-coding, and that’s something I need to sort out when I first create the databases, since there has never been a way to change it manually later. If this is a deliberate change, could you please at least let us know?

Hi @TinyCaterpillar,
I’m afraid we’re not planning to fix this any time soon.
The UI of the learning sessions is going to go through a few iterations in the next few months, and it’s not at all guaranteed that the icon will even remain.

Is the problem in typing tests? You do have a hint telling you what it is that you need to type, do you not?

Thanks
Daniel

I really hope you don’t remove the colour-coding!!

The only hint in the question text is a word in bold as part of an overall sentence. This means that, when reviewing, you have to go through a two-stage thought process (this is for Japanese but applies more generally):

  1. What is Japanese for [word]?
  2. Do I have to give the kana or kanji in my answer?

When the icons were colour-coded, my brain would answer (2) automatically. If the icon was blue in the corner of my eye, I knew I needed kanji. If it was green, kana. This skipped needing to make a conscious step and sped up reviewing significantly.

Now, it’s incredibly frustrating to enter the word the wrong way round only to get it wrong because ‘you didn’t read the question properly’. The colour-coding was an extremely effective way of keeping those mistakes to a minimum.

In essence, I don’t see how there is an advantage to removing the colours. There seems to only be a downside.

Thanks for listening to our feedback and I hope this sheds some light on the problem for some of us learners!

It would be helpful to get the color-coded marks back. As @TinyCaterpillar and @bobdat have stated, Japanese has three writing systems that Memrise tests on, and the colors streamline review. I’ve gotten plenty of words wrong since this feature was removed because I overzealously typed and hit Enter without having the color to remind me which exact script was needed.

Also, are you just changing the interface, or are there going to be more tweaks to the algorithm? Any notes would be appreciated, thanks.

Thanks for the reply, @daniel.zohar. That is a shame, for reasons that @bobdat and @TheFour-GatedDanzig have already mentioned.

Yes, the problem’s with the typing tests. There is the text underneath the prompt to tell me what to answer with, but the color-coding was much more prominent, and easier to spot out of the corner of my eye while I was typing. As it is now, I need to make a conscious effort to check the small text for every single item during reviews, which slows things down and distracts me from the tests. So for what it’s worth, my vote would definitely be to bring it back in some form or another eventually.

In any case, it’s helpful to hear back on it, so thank you. I guess I have no choice but to simply proceed with building my courses for now. Maybe I could get some help with setting them to the right colors manually later on, if the feature’s ever added back? :crossed_fingers:

@daniel.zohar @BeaTrisy
Hey guys, if possible could you comment a bit on the plans for the algorithm and course creation?
In many ways the website seems to be changing to become more like the app. As Ben Whately mentioned in a post last year, these were the two directions Memrise was taking at that time: the app, which focuses on gamification and the huge customer count streaming in from the app stores, and the website, which holds a large number of “old-school” users who use Memrise as a flashcard tool.

I, among others, use Memrise as a very important part of my language study. I customize courses for myself, try to contribute to a couple of others, and use other users’ courses as well. The algorithm I’ve seen used on the website since I started in 2012 has helped me in a big way in learning vocabulary.

Could you comment on any planned changes to the algorithm? Some things that have already been adopted from the app (like less punishment for incorrect answers) worry me because they just happen without prior warning. I also think the review timetable on the app was different from the website’s, and when I tried the app before, I was not able to retain words as efficiently (though I do think the app is useful for sentence-based courses and for its portability).

Hopefully this isn’t the plan, but if the plan is to veer completely away from custom courses and the backlog of user-made courses and instead allow access mostly to official courses, I do hope you would give a warning far in advance.
Also, if changes to the algorithm that could hurt retention of learnt words are in the works, knowing ahead of time would be much appreciated. If a change just happens and I don’t find out until I realize how poorly some words are sticking a month or two down the road, well, that’s no fun and very frustrating.
I’ve also mentioned before that, instead of just pouring changes out, struggling through some aspects of democracy with this stuff would be really helpful, even though I know that a centralized system for quickly making changes is much easier to manage on these kinds of projects. I’m sure a lot of users would be willing to be guinea pigs for algorithm changes on a private server. Many of us, having used Memrise for years, have more experience with memory and retention than the general public, simply out of diffusion and necessity. Tapping into that could help Memrise as well.

Thanks for your understanding, and hopefully you can give us a better idea of what to expect. That way, if certain things will be added or taken away, we can prepare to switch the necessary things over to other platforms and keep using Memrise for what it’s best at. :sunflower:

See, this is what infuriates me about customer service here. Both @BeaTrisy and @daniel.zohar have been here after your post, @Wuxian, yet we’ve received no message. For the most part, they treat users like our opinions are worthless and not worth the time responding to.

Also, I swear the algorithm has been changed. I’m getting such massive review loads that I fear living without internet for even one day. Anyone else experiencing this?

@TheFour-GatedDanzig Yeah, I sort of tried to write out all my issues to see if they would respond. It might just be that it’s a sensitive subject and they don’t want to put out bad publicity. The recent level of involvement with the community is a stark contrast to when Ben Whately would answer questions years ago. Obviously that’s much harder now, but I think the reason that same attention isn’t coming through is that this was Ben’s baby. Also, whoever is delegating priority up top has these devs focused on the app in a big way. I understand if they can’t talk about changes, but a heads-up would be a huge help. I’m getting through HSK 5 in Chinese right now, and I think that to migrate the course, with multiple test directions, and learn the functionality of Anki, I’d need a few weeks to get back on track, and that’s if I wasn’t demotivated by the process. I’m kind of just waiting for enough evidence from certain ignored questions to make it obvious that I’m going to need to migrate. I mean, what would be the alternative? Staying on Memrise and waiting for new official courses that I need like people wait for new iPhones? It’s also very frustrating because there are just so many things being ignored that addressing them takes a good amount of text and criticism. This looks like drama to newer users and you get this reverse backlash.
Hopefully everything works out well; they did seem to quickly address the on-screen keyboard and auto-accept function this week. It’s just silly to me. Why are they addressing these small features that pile up into user frustration when, for instance, a super obvious thing they could work on is this: I need to wait about 30 seconds to a minute for my dashboard screen to load. Sometimes 1-2 courses pop up and the rest take a minute; sometimes it all takes a while to load. Surely basic page loading is a priority over new features and the Ziggy character?

Also, yeah, I’ve been getting pretty heavy review loads; I had chalked it up to just one of those days where the stars align and a lot of words overlap.
To be honest, if they’ve added a review session somewhere in the timetable, I wouldn’t hate it if it were logical. My biggest concern is that in the app algorithm, words felt like they were lacking a crucial review session somewhere down the line. Either way, I would love to know what they’re up to.

If you think about it, it’s a pretty tempting situation. Memrise happens to be in the learning category, and when you promise people a way to magically learn things, demand is naturally huge. With a small amount of marketing, the desperation and easy purchases in an app store probably pumped tons of new revenue (and new bugs to fix) their way. This drove in new staff who don’t fully get Memrise’s history and, in my personal hypothesis, a head of marketing who was versed in a kid-friendly avenue that had already succeeded at some company, and who transferred that thought process over to sort of “clone” the formula of “do this and you’ll have a successful app.” The randomness of the priorities totally mimics environments I’ve experienced in small companies, where one person dominates the delegating in unfamiliar avenues and the others follow. So, lots of different things are crossing over in the company’s direction, and the website is right in the middle. :sweat_smile:

“I need to wait about 30 seconds to a minute to load my dashboard screen.” I totally thought this was a me-problem; now I’m glad I have something else to complain about! You’ve rightly highlighted the crux of the issue: the staff keep adding features no one has asked for or anticipated (because they’re not functionally useful), whereas month-long discussions about certain changes go ignored.

I think something has changed with the intervals for red/completely wrong answers. Now it seems like the system waits over 15 hours to give you those words back. That means that if I do my review sessions in the morning at eight, I can’t tackle those words again until the following morning, which will have additional words anyway. The snowball effect is horrible. They keep talking about the program changing if you’re an advanced user. If this interval change is somehow connected to the Ziggy system, then props to the developer for being a lummox and coming up with the two as a pair. They’ve definitely hired some random people who don’t get Memrise. They were probably in the kid-app business, but maybe also did an A-level in some foreign language, lol. I can’t really fathom dedicated people who use this site implementing these asinine changes.
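
To put concrete numbers on the snowball effect described above, here’s a minimal sketch. It assumes a fixed 15-hour re-review delay and a single daily session at 08:00; these are guesses from observed behavior, not Memrise’s actual scheduler code:

```python
from datetime import datetime, timedelta

# Assumption (not Memrise's actual code): a failed word only becomes
# reviewable again after a fixed 15-hour delay.
FAIL_DELAY = timedelta(hours=15)

def next_review_after_failure(failed_at: datetime) -> datetime:
    """Return when a word failed at `failed_at` becomes due again."""
    return failed_at + FAIL_DELAY

# A user who reviews once a day at 08:00 fails a word in that session.
morning_session = datetime(2018, 5, 1, 8, 0)
due_again = next_review_after_failure(morning_session)

# 08:00 + 15h = 23:00, i.e. after the day's only session has already
# happened, so the failed word rolls over into tomorrow's queue on top
# of the words newly due that day.
print(due_again)  # 2018-05-01 23:00:00
```

Under these assumptions, any fixed delay longer than the gap between a session and the end of the user’s day pushes every failed word into the next day’s pile, which is exactly the compounding backlog being complained about.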

Your mentioning of Anki worries me because… yes, I think we’re deluding ourselves into sticking around. The site continues to get worse and worse… but all of the time I’ve invested here, not to mention some really great courses from users… it sickens me to think we had that huge debate about getting mems back, only for them to tinker with the algorithm again and phase out your selected mems after a couple of days. If they get rid of user-created courses without a heads-up, my blood boils just thinking about it, especially because, with the direction we’ve been going in, I can see them doing so.

I think some people have brought this up already in this thread, but I’ll just bring it up again (sigh). My JLPT courses once had different colours for the question-mark icon that signified whether it was a kanji or kana input question, but now they’re all purple, and this is unbelievably disorienting. Those colour switches let me build a nice reflex and practice smoothly, but now I have to stop and read the text to see if it’s kana or kanji. It’s made it impossible for me to seriously practice, and I’ve allowed my words-to-review to stack up into the 1600s. I also see, by the way, that Ziggy is still here… ugh.
