Super random question for everyone... When it comes to memory matching games (like the N-spade mini-game in SMB3), which is a better mechanism for high scores: the time it takes to match them all, or an accuracy percentage at the end? I'm working on such a game for my iOS app, and I'm torn over which would be the better measurement.
Firstly, you could do both by using a points system instead:
Reward for good time and deduct for mistakes
Code:
(MaxNumSecondsForPoints - NumSecondsTaken) * PointsForTime - NumMistakes * PointsDeductedPerMistake
OR
Reward for good time and reward for good accuracy
Code:
(MaxNumSecondsForPoints - Math.Min(NumSecondsTaken, MaxNumSecondsForPoints)) * PointsForTime
    + (MaxNumMistakes - Math.Min(NumMistakes, MaxNumMistakes)) * PointsForAccuracy
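Just to make those formulas concrete, here's a rough Swift sketch of both variants. All of the names and constants (maxSecondsForPoints, pointsPerSecond, and so on) are placeholders for whatever your game actually uses, and I've clamped everything at zero so a slow or sloppy round can't produce a negative score:

Code:
```swift
// Placeholder constants -- tune these for your own game.
let maxSecondsForPoints = 120.0
let maxMistakes = 20
let pointsPerSecond = 10.0
let penaltyPerMistake = 50.0
let pointsPerAvoidedMistake = 25.0

// Variant 1: reward leftover time, deduct a flat penalty per mistake.
func scoreTimeMinusMistakes(secondsTaken: Double, mistakes: Int) -> Int {
    let timeBonus = (maxSecondsForPoints - min(secondsTaken, maxSecondsForPoints)) * pointsPerSecond
    let penalty = Double(mistakes) * penaltyPerMistake
    return max(0, Int(timeBonus - penalty))   // don't let a bad round go negative
}

// Variant 2: reward leftover time AND reward accuracy, both clamped at zero.
func scoreTimePlusAccuracy(secondsTaken: Double, mistakes: Int) -> Int {
    let timeBonus = (maxSecondsForPoints - min(secondsTaken, maxSecondsForPoints)) * pointsPerSecond
    let accuracyBonus = Double(maxMistakes - min(mistakes, maxMistakes)) * pointsPerAvoidedMistake
    return Int(timeBonus + accuracyBonus)
}
```
You'd call something like scoreTimePlusAccuracy(secondsTaken: 47.2, mistakes: 3) at the end of a round; the relative weights between time and accuracy are where the tuning happens.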
BUT, both of those are definitely more complex systems. The extra complexity could make the game more fun, but it also makes it harder to judge where you stand while playing.
So, if you stick to just one metric, I would use time, because making mistakes costs time, so you are indirectly measuring accuracy anyway (especially if a mismatch animation plays on every wrong guess).
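If you do go time-only, the measurement itself is simple. A minimal sketch, assuming you timestamp the first card flip and the final match (CACurrentMediaTime() is a monotonic clock, so it won't jump if the user changes the system time):

Code:
```swift
import QuartzCore

// Minimal time-only scoring: stamp the start, stamp the finish, diff them.
final class MatchTimer {
    private var startTime: CFTimeInterval?

    func start() {                     // call on the first card flip
        startTime = CACurrentMediaTime()
    }

    func elapsed() -> Double? {        // call when the last pair is matched
        guard let start = startTime else { return nil }
        return CACurrentMediaTime() - start
    }
}
```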