Passion 2013 Conference, Part 1
Apr 4, 2013 11:14 AM, With Bennett Liles
The staging for this thing was massive, but you were also doing a live web feed with, I would think, different sources going to that than were being fed to the big LED screens and ribbon displays in the dome. Did you have one truck handling the online show and the other one doing the other feeds?
We did. The two primary trucks each had a specific function and shared the cameras between them. Our larger expanding-side truck, “Aspiration,” was primarily responsible for the screen feed, which some would assume was the secondary concern. But because there were so many destinations and so many different aspects, it took priority and got the larger truck with the larger switcher. Our secondary truck, our 40ft. straight truck “Inspiration,” switched the Web feed. Both trucks received all 23 cameras. Both directors had full ability to communicate with each of the cameras, and the camera operators could keep up with who was using their feed through the tally light system. We integrated a multi-colored tally system that gave a camera operator a red tally light if the camera was on an LED videoboard, a green tally light if it was on the Web, and a yellow tally light if it was being used on both. That made sure the truck staff, by watching their multiviewers, and the camera operators, by watching their tally, were aware of who was live, when, and for how long. [Timestamp: 7:10]
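The tally logic described above reduces to a simple priority rule. Here is a minimal sketch of that rule; the function name, camera numbers, and feed sets are illustrative assumptions, not part of TNDV's actual control system:

```python
# Sketch of the multi-color tally logic described above:
# red = live on the LED videoboard feed, green = live on the Web feed,
# yellow = live on both. All names and numbers here are hypothetical.

def tally_color(camera, board_feed_cameras, web_feed_cameras):
    """Return the tally color a camera operator should see."""
    on_board = camera in board_feed_cameras
    on_web = camera in web_feed_cameras
    if on_board and on_web:
        return "yellow"   # both switchers are using this camera
    if on_board:
        return "red"      # the screen-feed truck has it live
    if on_web:
        return "green"    # the Web-feed truck has it live
    return "off"

# Example: camera 7 is live on both switchers, so its tally shows yellow.
print(tally_color(7, board_feed_cameras={3, 7}, web_feed_cameras={7, 12}))  # yellow
```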
And all of that can be really easy to get totally messed up when everything gets going, given the complexity of this and so little time to get in, set it all up, and make sure everything is working right. There were some very long cable runs. The Georgia Dome is a huge place, so what type of video signal cabling did you run to the cameras?
The cameras were actually a SMPTE fiber-based product from Hitachi, and that SMPTE fiber cable allows us to run great distances. It’s still a single cable, much like the triax we were all used to for so long, but it’s an optical fiber cable, which gives me a little better reassurance about signal integrity and the distances we can run. The run from the truck to the field was 500ft., and that 500ft. went through the drop ceiling of the security offices, through an air conditioning duct large enough for someone to walk through, and through two different corridors. So just the first 500ft. was tricky, and from there you’re only at the edge of the field. It was still a minimum of 500ft. to any other camera location. Many cameras were 1,500 or 2,000 total linear feet from the truck, but because of that SMPTE fiber cable we were still able to shade, have full intercom communications, send return video, and get great-looking HD video back from the cameras with no problems. [Timestamp: 8:31]
I’ve seen the existing ribbon displays in the dome. Was there anything you fed to the in-house system, or was all of this set up by your team from the beginning?
In addition to the concourse plasma screens, we also fed the ribbon boards and all of the VIP suites. We did that with the aid of a Spyder video processor located at front of house, which took care of some of the aspect and resolution management for us. But all of the signals originated from the truck, and we used a series of tactical fiber cables and the FiDO brick product from AJA Video to get HD-SDI from the truck into the venue, approximately 1,000ft., and back out to the truck again. So we had the SMPTE fiber camera cables, our tactical fiber with the AJA FiDO bricks, and our copper infrastructure for intercom, for which we used an ADAM frame from RTS. We had 10 key-panel intercom stations at front of house in addition to all of the key-panel user stations in the truck, plus all of the traditional party-line intercom, to make sure the venue was able to communicate with the truck effectively.
And the trucks were equipped with what type of routers?
We used routers from Harris. We have their Platinum series routers, which we used in every possible way on this show. The Platinum routers have multiviewers integrated: any signal that hits the router is available to be displayed on a multiviewer, where you can control tally, UMD, and any scaling of the picture-in-picture elements, and there are also some interactive elements. We were able to dynamically place logos, so we put Passion logos on the multiviewers. We were able to display live incoming RSS feeds, so I had a scroll of tweets from Twitter accounts that were mentioning the event, all live on one of 12 discrete, custom multiviewers we built. So not only did the front monitor wall have custom multiviewers, we also allowed our producers to build multiviewers just for themselves or for front of house. Our shaders in the truck, of which there were four, were able to build multiviewers showing the cameras they were working on, and even in our audio suites we were able to assemble a custom multiviewer so our audio engineers could monitor the cameras relevant to them. [Timestamp: 10:45]
This show required a lot of people to handle it. How many people were on your tech crew doing all of these things?
You know, we’ve attempted to count that, and I don’t think anybody’s come up with a final number yet. But between two directors, ADs, TDs, 23 camera operators with grips and utilities, four shaders, three engineers-in-charge, a tech manager or two, and a host of A2s and utilities, we had an army. We had an excellent team that was able to deploy this massive video system in less than a day, I might add. We didn’t gain access to the venue until around 2:00 a.m. on the day doors opened for our first show at 6:00 p.m. So it was a long day, but essentially less than a day to load this entire video system in, and I’m proud to say, now that we’re no longer onsite and I can’t jinx us, that it worked flawlessly. We had everything we expected to have on day one, and we were even able to go in on day two and make some modifications and adjust lensing. But everything worked great on that first day. So we started at 2:00 a.m., opened doors to the public at 6:00 p.m., and kicked off the week right. [Timestamp: 11:51]
All right. Nic Dugger, technical manager with TNDV Television in Nashville. It was a great event, Passion 2013 at the Georgia Dome, with little time to set up and a whole bunch to do. In part two we’ll get into the intercom setup and the audio feeds. It was great to have you here and thanks for taking us behind the scenes.
It’s my pleasure. I hope to see you in one of our trucks soon.