Screwed up results?

Well, Autodesk finally sent back the results of the animations, and I found my team's to be quite odd. In the first round we got scores (each out of 10, I assume) of 8, 7.2, and 7.4 from different judges … OK, that's fine. In the second round, however, we got average scores of 0.8, 1.2, 0.8, 6.4, 6.8, and 8.2 … Notice something out of place? Maybe the two scores below 1 and the 1.2?
I understand there could be some differences between judges, but a drop of almost 7 points?

Let me know if I'm screwed up, if the results are screwed up, or maybe if the whole world is screwed up.

Who did you contact, exactly, to get your scores back? I know Autodesk sets up an email address every year where teams send in a message and are supposed to receive their scores, but never do. I am wondering how you got yours.

Sunny, I contacted that email address, and they actually sent the scores over the weekend.

Hey Fork, it ain't you. I got the same thing. My lows weren't quite that bad, but they still didn't make sense to me. So I shot off an email to ask what kind of information they could give about what the scores were based on.

I liked the first section where they listed (from an obvious scoring sheet checklist) information that showed what the judges saw in the animation. But the second round results were very strange.

I also asked that next year they include an anonymous bio on the judges so we can put weight to their scores. Frankly, I'm not into taking blind advice. If they want the scores to be helpful, this would add a lot of value.

If my 9's and 10's were given by a 'non-experienced' viewer and my 5's and 6's were given by a super-experienced, well-trained creative person, I'd put more weight on the lower scores (and their comments). And of course, vice versa.

So we'll see if they respond. I sent a suggestion regarding the Monday deadline, and they responded favorably.

Note: I didn't like the Monday deadline because it took away about 2.5 full days of fine-tuning and left us finalizing for the sake of the deadline instead of satisfaction with the results (kinda like my work), but it didn't need to be this way. Yes, I know we had the deadline ahead of time and we could have planned for it. BUT the second the robot ships, we have considerably less after-hours access to the computer lab. And the deadline was the same weekend as our first regional, and my two animators were off on Thursday to help with the setup. So we had to finalize our renders by Monday, use Tuesday to revise any issues (thanks to network rendering), edit all day Wednesday, and FedEx on Thursday, because from Thursday to Saturday night, weez-a-busy.
Had the deadline been Tuesday, we could have fine-tuned our edit on Saturday night and Sunday, had Monday for a team review, and made FedEx by 7 p.m., no problem. Big difference in the results. We spent just two or three hours working on a “director's cut” and it's a million times better (timing of text, transitions, and such).

So… long story even longer: they responded favorably to that email, so I may hold my breath on this other issue too.

I worked with TheFork on our animation, and I am very angry about the results. The judges gave us 0s and scores around that area without leaving any comments, which seems even more suspicious to me! I think we were robbed of our chance to win!

Those judges do what they want because they are anonymous. I wanna see them do the same if their names or bio info were released.

I guess Autodesk finally got on the ball, or they just have a thing against us, because we never received our scores before.

Do not get upset over your scores. The process is a very involved, multi-step one that leads to a final winner. The reason your scores are so different is simple: the round-one judging is not intended to be as scrutinizing as round two. This was very evident last year when a site briefly posted all of the scores for every team. Although the data was removed quickly, the trend of first- and second-round scoring being different was blatantly obvious. The second round is intended to produce a winner, so the judging is much more rigorous. I suggest you watch the Autodesk compilation reel for more information about scoring. In it, they clearly describe the process from start to finish.

Hope this helps,
Brandon

PS… Does anybody know if it is too late to get our scores? I really wanted to see them this year. What is the email address that you used to obtain scores?

first.entries@autodesk.com

I sent my email this afternoon (3:30-ish) and got my scoring sheet an hour and 20 minutes later.

This is our score sheet, and I can see why many may be suspicious of it. We did pretty well in the first round, but in the second round the first three judges gave us an average of about 2.3 while the rest averaged about 7.3. That's a difference of 5 points. Also, the judges that scored higher left huge amounts of comments, while the low-scoring judges left none at all. I can see that they may be more experienced or something, but they should at least tell you what you are doing wrong. I think they may need to explain this to everyone. Even though it won't change anything, it could bring peace of mind.

team 862 scores.pdf (4.28 KB)

I agree that there is a difference between rounds, but the questionable thing to me is the spread within the second round. Still, as a judge in martial arts competitions, I know it doesn't matter what the scores are between judges, just how consistent each judge is. If I'm a low scorer and the guy sitting next to me is a high scorer, but we each give our highest score to the same person and our lowest score to the same person, most of the others would filter into the standings pretty similarly, and the results would be what they should be (the quick sketch below shows what I mean). My complaint wasn't the numbers (especially once I saw everyone had similar results). I'd really want to see the comments carried through the whole process (they've gotta base the scores on something!). But as stated, I want to know the background of each judge so I have an idea of their perspective on art and life itself.
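To illustrate the point, here is a toy sketch (made-up teams and numbers, nothing to do with the actual judging): a harsh judge and a generous judge can use completely different scales and still produce the exact same standings, as long as each judge is consistent with themselves.

# Toy example: two judges with very different personal scales
# still rank the same three (hypothetical) teams in the same order.

def ranking(scores):
    # Sort team names from highest score to lowest.
    return sorted(scores, key=scores.get, reverse=True)

harsh_judge = {"Team A": 2.0, "Team B": 1.0, "Team C": 3.5}  # scores everyone low
easy_judge = {"Team A": 8.0, "Team B": 7.5, "Team C": 9.0}   # scores everyone high

print(ranking(harsh_judge))  # ['Team C', 'Team A', 'Team B']
print(ranking(easy_judge))   # ['Team C', 'Team A', 'Team B'] -- same standings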

PS: We were asked to include a subject line (it may help with the filtering of the messages).

Here's the entire link in one place:

first.entries@autodesk.com Subject= “Autodesk Award, Request for Score, Team #xxx.”

You can try this, but I don't know how it will parse in the forum:

first.entries@autodesk.com?subject=Autodesk Award, Request for Score, Team #xxx.
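If that link doesn't open correctly (spaces in a URL often break), the percent-encoded form of the same address and subject should be safer; this is just standard mailto encoding, not anything Autodesk specified:

mailto:first.entries@autodesk.com?subject=Autodesk%20Award,%20Request%20for%20Score,%20Team%20%23xxx.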

MAKE SURE YOU CHANGE YOUR TEAM NUMBER IN THE SUBJECT LINE.

If that doesn't work, then type away, type away, type away all! (Did anyone notice a holiday ring to that?)

Our team got screwed over completely.

First of all, our scores were extremely messed up, but our average didn't reflect that. Also, it said that we forgot to leave a blank screen with all the required information, which we did include for about 10 seconds. I think there are a few things that could have gone wrong with the judging, but if someone from Autodesk could explain, it would be greatly appreciated.

Exactly…
And I also forgot to mention that they said we didn't include many things, such as a storyboard (which we did, for sure), and an entry with a still frame being held with the team number, etc. (which we did include for sure too).
I think Autodesk has some justifying to do.

Our scores seemed to be pretty consistent. I want to see what team 114’s score sheet looks like. Anyone know if I can just request their score sheet? I don’t know of anyone on team 114 I can contact.

team 64 scores.pdf (4.53 KB)

[GRR! I just finished writing a long post and it got accidentally deleted before I could post it…so here’s the somewhat nutshelled version]

[Heya Sunny, I met you at SoCal and nationals, at the animators’ meeting…so, in a really brief way, you know me. ;)]

I'm the animator (yes, singular) who did the animation for team 114, which won the grand prize… Our score sheet shows the same trend, where in the second round the first three judges gave lower scores than the others (their averages were 2-3 points lower than the rest). I guess they were simply working to a higher standard than the others.

About the score sheet team 862 posted: while the first two judges' scores were very low, they gave technical excellence reasonable scores. This makes it seem a lot less odd to me than if all the scores were simply much lower (nod to 912).

However, we’re not going to get an explanation from Autodesk unless someone contacts them.

Anyway, if you want to take a more detailed look at our animation, the official-quality version (60 MB Cinepak .avi) is available here:
http://www.la.mvla.net/XtraCur/Clubs/2002Animation.avi
(the other people listed on the animation team were included to recognize their efforts in learning this year, but they did not work on the actual animation)

All the prize winners' animations are viewable on Autodesk's site in streaming Windows Media Player format at:
http://usa.autodesk.com/adsk/item/0,,1972188-123112,00.html

Congrats to all…the overall quality of animations keeps improving each year (while I’m new to the team this year and will be graduating this year, I saw the compilation tape from last year). Regardless of whether your animation won, you’ve created something truly remarkable.

My animator partner this year apparently sent our email a few weeks ago and we still haven't gotten a response with our score. Maybe I should send another email…

Why did they have to stream the winning animations!!! I HATE my school’s firewall…can’t stream ANYTHING!!!

lol, I didn't even get my results back yet, so I don't know how to help ya lol

*Originally posted by FizixMan *
**My animator partner this year apparently sent our email a few weeks ago and we still haven't gotten a response with our score. Maybe I should send another email…

Why did they have to stream the winning animations!!! I HATE my school’s firewall…can’t stream ANYTHING!!! **
There is software that will take a link to streaming media and just download it. I forget what it's called, but I used to use it, and I got it from www.download.com.

Thanks for the idea, Sunny. I'll try that at school (I'm not dl'ing these over my 56k!).

I just got our results back, and the trend continues; however, our team wasn't hit nearly as hard as some of you guys. Those same three judges did lower our score, but their scores for our team averaged about 6.5 (rather than the 1.5 for some of you guys). Our scores only dropped about 0.6 points (from 7.73 to 7.17) from round 1 to round 2.

I was wondering if anyone managed to receive an average of over 9 points or received many 10’s.

Oh, here are our scores if anyone's interested.

Team 783 SWAT Mobotics animation scores

Oops, I forgot Angelfire is evil. Use this instead:

team_783_scores.pdf (4.12 KB)