Webcast camera angle...

As far as I am concerned the webcasts are pretty useless.

I know that webcasting is limited by bandwidth issues, but I am not talking about that.

What I think ruins the broadcast is that FIRST is feeding whatever they put on the projection screen TV to the webcast.

While goofy camera angles and close-ups of ball jams and broken robots may be exactly what folks AT THE EVENT want to see on the big screen, they are only confusing and frustrating to someone offsite trying to figure out what is going on.

I am BEGGING FIRST & NASA TO PLEASE give us a static full field shot and leave it at that.

It will be more useful for the teams trying to learn the game and more entertaining to watch for those who are just trying to see what FIRST is all about.

Joe J.

I agree, and if we had that, the video might actually be clearer because of the nature of live compressed video. It does not handle background motion well (such as the camera moving and not just the robots). What this format does is transmit only the things that are moving, while things that are still (such as the field background) are looped at the user end; since they are still, they don't need to be retransmitted because we already have them in memory, thereby saving bandwidth.
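
(Just to illustrate that point, here is a rough Python sketch of the frame-differencing idea behind most streaming codecs. I don't know which codec NASA actually uses; this is only a toy example of why a fixed camera compresses so much better than one that pans around.)

```python
# Toy illustration of frame-differencing (the idea behind temporal
# compression in streaming video): only pixels that changed since the
# last frame are "sent"; everything else is reused on the viewer's end.
# Real codecs use motion compensation, macroblocks, etc. -- this is just
# the concept.
import numpy as np

THRESHOLD = 10  # how different a pixel must be before we bother resending it

def encode_delta(prev_frame, new_frame):
    """Return a mask of changed pixels plus their new values."""
    changed = np.abs(new_frame.astype(int) - prev_frame.astype(int)) > THRESHOLD
    return changed, new_frame[changed]

def decode_delta(prev_frame, changed, values):
    """Rebuild the current frame from the previous one plus the changes."""
    frame = prev_frame.copy()
    frame[changed] = values
    return frame

# With a fixed camera only the robots change, so very few pixels get resent;
# if the camera itself pans, nearly every pixel changes and the savings vanish.
```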

*Originally posted by Joe Johnson *
**
I am BEGGING FIRST & NASA TO PLEASE give us a static full field shot and leave it at that.

It will be more useful for the teams trying to learn the game and more entertaining to watch for those who are just trying to see what FIRST is all about.

Joe J. **

Joe -

The webcasts are pretty much limited to the video stream that FIRST provides to us, which (as you noted) is a tap from the video displayed on the big screen. There is not much NASA can do about what is shown on the video, but I will pass your comment on to FIRST.

I will note that your comments are just as applicable at the event itself. In many cases, the members of the audience may have less-than-optimal viewing positions (at least this was the case at VCU if you were seated on the sides of the field). The only real view you had of the action was the display on the big screen. While it may be “good TV” to see close-ups of the different mechanisms, it was virtually impossible to get the context of the images or to understand what was happening across the entire field. Maybe we need to build a “Goodyear Blimp-Cam” for the arenas!

-dave

I agree. I had no problem with the picture quality, but most of what I saw was close-ups, and I couldn’t even tell what zone I was looking at, never mind what direction each team was going. It looked like there was a camera mounted high in the stands at at least one of the regionals, and I relished the few wide shots they gave us. I really would’ve preferred Joe’s suggestion, especially in this year’s game, where location is everything and the zones look identical. However, I am happy that they are giving us a feed at all.

In a perfect world, I’d also like simultaneous regionals (like KSC and VCU this weekend) to be synchronized, so that while a match is going on in one, the other is resetting the field, and I don’t have to watch two matches at once. However, I don’t think this is likely to happen.

Joe, you could appoint someone at the regional to keep a virtual playing field and move stuff on it as the game progresses! All shall be happy (except for the person managing the virtual field).

P.S. Not it

I agree that it was hard for me to keep track of scores just by watching the webcam… The camera was always zooming in on a robot or on part of a pushing war. However, it wasn’t all that bad after a few rounds.

I think what the camera man ought to do is, at the end of a match, slowly pan across the field, focusing on one zone at a time. So, at the end of a match, he would stand just about at the center, look into zone 1, then zone 2… until he has covered the whole field. Meanwhile, the announcer could work together with that and announce the result. Either way, they really need a spot up in the stands to put the camera. And if they do this right, there won’t be any confusion about the score after each match.

Meanwhile, I thought what would work really well is a little game simulator up in the corner of the screen that shows the positions of the robots and the goals.

If all we get is a static full-field shot, not only would it be hard for people to look closely at each robot, it might be pretty boring just watching little robots moving back and forth pushing each other around.

It was pretty interesting to see how each robot locks onto the goal, how their extensions work, and how they lock down on the field. So, if only we could have a mix of those two cameras. It would be really nice if FIRST could afford to edit video footage on the fly, like a small crew that does live shows.

But that’s when FIRST gets lots of sponsors.

Oh yeah, I remember FIRST had this camera moving around recording footage at the regionals, only that camera sent its video to a separate place from the webcast. You could bring in a VCR to record footage from that camera, but it won’t show up on the webcast.

I’ve watched many practice rounds, in which the camera was fixed and showed the whole field. It is one of the most frustrating experiences imaginable. Because of the size of the field, it is extremely hard to discern which robot is which, let alone what it is doing.

The only thing that I ask is that they stop zooming in on the drivers, as that isn’t very exciting, either for the people watching on the web or for the people at the competition.

Joe,

I’d have to say that I do not think I agree with you, but I am willing to set up a test. I and a few others from NASA have been responsible for webcasting both the NASA/VCU and Philly regionals for the past several years, and I have spent a considerable amount of time watching different webcasts. I DO agree that sometimes what is displayed on screen is not even close to what remote or even local viewers really need to see in order to understand what is happening on the field. Sometimes the video is more of an “interest shot” - a close-up of robot interactions, smoke beginning to come out of a robot, a robot flipped upside down with wheels madly spinning, etc. - while the issues determining the outcome of the game are taking place off camera. But when my group does webcasts, we try to have them up for the Thursday practice rounds as well, when the ONLY camera feed that is active from FIRST is usually the fixed, full-field view. And my personal thoughts are that while this does provide the big picture and may even be a better quality streamed image (due to a smaller percentage of the actual image changing with each frame), you miss so much of the detail that it becomes rather maddening. BUT . . .

I’m willing to propose a test. If I can make the necessary arrangements to webcast the Philadelphia event in two weeks, I can provide a full-field, static view from a NASA camera, and patch in the FIRST audio (so that you would not hear the crowd noise from the camera mic and would get the announcer’s play-by-play calls). IF (IF, IF, IF) there is interest on the part of enough of you (FIRST team members), AND you will provide feedback, then I will try to set this up: one full competition (Philly) with only a static camera angle of the entire field. Let me be clear that even if this test does prove to be the best way to webcast an event, it DOES NOT mean that NASA or FIRST would commit to doing things this way in the future - it would just be a data point to help make future decisions.

So now it is up to you - would you all really like to try this out?

My particular team will be playing the weekend of the PA regional, but I still would like for the experiment to go on.

Assuming that teams get better at strategy with each week, it should be a very strategic week of matches.

Perhaps there will not be enough detail to tell what is going on. I don’t know.

I vote for trying it. Any other voices?

Joe J.

My team, 365, was at the VCU Regional, and also will be at the Philly Regional. But I do agree that a test of this nature would provide good data points for determining the type of webcast to be used for future events. I personally think that a remote-controlled camera mounted above the spectator side of the field would give a good perspective for the webcast. I also like the idea of patching in the on-field announcer.

I’ve been thinking about this a lot, and I think the only way to make everyone happy - allowing viewers to keep track of the score AND showing mechanisms and bots up close - is to have two streams: one low-bandwidth stream showing the full field, and one higher-bandwidth stream showing what is on the big screen. This would probably be a lot of trouble for NASA to set up, especially since the streams would have to be in different formats to allow most people to view both at once.

EDIT – Ooh, I just had a better idea. I don’t know what kind of video processing equipment NASA has at these events, but simply putting a picture-in-picture view of the entire field in the corner would allow one really useful stream. I’m not sure how practical that is, but I know it is more reasonable than two streams.
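
(To sketch what I mean by picture-in-picture: assuming the frames are just pixel arrays, a toy Python/NumPy snippet like the one below could composite the full-field view into a corner of the main feed before it ever hits the encoder, so only one stream goes out. I have no idea what NASA’s actual switcher gear looks like; this is purely illustrative.)

```python
# Toy picture-in-picture compositor: shrink the full-field camera and drop
# it into the top-right corner of the main (big screen) feed, so a single
# stream carries both. Frames are plain NumPy arrays here; the real AV
# gear would do this in hardware.
import numpy as np

def composite_pip(main_frame, field_frame, scale=4, margin=10):
    """Overlay a 1/scale-size copy of field_frame onto main_frame's corner."""
    small = field_frame[::scale, ::scale]          # crude downscale
    h, w = small.shape[:2]
    out = main_frame.copy()
    out[margin:margin + h, -(w + margin):-margin] = small
    return out
```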

For $100 or so you could have a student on the team (or someone else who knows how the game is played and what to show) run a camcorder. The camera would mostly take full-view shots, but could also zoom in on a subset of the field if the important action was there. The camera person would be told to do SLOW pans and not to stay zoomed in too close for too long.

This would be a good compromise between a static full-view camera and the MTV close-ups that dominated the pictures I saw last weekend.

As to where the money comes from, let’s pass the hat – I will pitch in $10 to get us going. Where should I send my money? Any other donors?

Joe J.

P.S. Of course, I have no idea if this is even possible with the NASA set up, but if we don’t ask, we’ll never know. JJ

I don’t know who all is out there, but I can say from experience that directing live video (i.e., calling the camera shots) is very difficult. This is true even in a play, where you supposedly know who is going where and when.

The balance would seem to be 2-3 second close-ups at critical moments followed by a return to an overall shot so that people can get re-oriented. The overall shot should be from the same place all of the time so people can stay oriented.

It also means at least three good camera operators, a person telling the camera operators what shot they should have, somebody doing the switching, and finally somebody who really understands the game calling out which shots to switch to.

Oh, and by the way, if these are amateurs, you’d better have at least two complete teams you can swap out. After a couple of hours the first team is going to be fried. Come to think of it, even for football the pros don’t actually shoot for more than two or three hours. Remember, a competition lasts all day and people will lose performance after a while, so it is essential to pace things.

It’s been several years since I did anything like this, and the equipment wasn’t anywhere near what is probably being used now, but I’d be willing to jump in and help out at SJ or LA if somebody wants me to. :smiley: Especially if I knew I’d get spelled after a couple of hours or so.

Before we go too far here, I think I should clarify a few things. First, I agree with Chris - getting good video is an art. Even with only one camera it is not the easiest thing to do, and with multiple cameras you need a producer to call the shots. Who hasn’t gotten motion sickness from watching amateur home video?! Yes, tripods can help, but selecting and framing shots well is truly a skill that must be learned. That aside . . .

The webcasts are one of NASA’s many contributions to FIRST, and at each event NASA puts a considerable amount of effort into pulling them off by coordinating with FIRST staff, the networking staff on-site, and the FIRST-contracted AV production crew. But at this point there is really no way to turn the webcasts into productions of their own. There are limited financial resources along with limited “people” resources, not to mention that all of the NASA people handling the webcasts have “real” jobs to do before and after the events.

So we ARE interested in figuring out how we can make the webcasts better, but there is very little chance of implementing any sort of large-scale web-based production - the reality is that the webcasts are a “bonus” and not a critical element to the success of individual events. So keep the suggestions coming - I can promise you that we are listening and that all of us here at NASA doing the webcasts are avid FIRST supporters ourselves. Just don’t be too disappointed if the webcasts don’t win any Academy Awards any time soon! They are designed to be one way of allowing remote team members and other interested people to be a little more connected to the event, but they will never be prime-time (or at least not any time soon!).