Distance Learning in the University of Maine System, Part 1
Dec 14, 2010 12:00 PM, With Bennett Liles
What was the overall goal of the project?
Well, the overall goal was to replace the transport of the existing live video system. That's a statewide system; it interconnects all of the campuses of the University of Maine System and all of our off-campus education centers. Originally that system blanketed the entire state, with something on the order of 80 receiver locations, using a combination of fiber-optic network, microwave network, and some other small networks in between. So the goal was to upgrade all of those individual pieces of the old network and switch entirely to an IP-based transport. That's what the Haivision equipment allowed us to do. [Timestamp 4:14]
And this is a fairly competitive field in IP streaming. Any special reason why you decided to go with Haivision on this one?
In our case, we had been using the original Haivision hardware as part of our backbone from the start. We had their old MPEG-2 series, which they called the hai 560, and we were using that for IP-based MPEG-2 transport. We had 20 or 30 of those pieces of gear, and they had proven extremely reliable for us. So from a hardware standpoint, Haivision was already a known entity, and we were experienced with their way of doing things and their equipment, so that was an easy choice. [Timestamp 5:02]
Well yeah, especially on a huge project like this one. If you're already familiar with the reliability of a specific make of gear, it's probably smart to stick with what you know.
Right, and the other thing that made it more attractive was that they had married the hardware with the Video Furnace software piece. The hardware and the software together gave us familiar hardware plus a total management system: we were able to look at the codecs on the one end, the network performance in between, and the set-top boxes at the far end, with the scheduling piece and management all rolled into one integrated package instead of using solutions from different vendors. So that also made it very attractive. [Timestamp 5:49]
Obviously better than starting from the ground up with something you're crossing your fingers on, going on faith and referrals. So we've talked a bit about the delivery mechanism, but what about the content? What's being delivered on this, and what's the scope of this thing?
This fall we're offering just over 80 courses, 84, I believe. There are about 3,800 students enrolled in the live television courses, and there are just over 60 locations statewide where they're taking those courses. That's running from 7:30 in the morning to about 10 at night, and it's pretty normal to have about four live classes running simultaneously. [Timestamp 6:38]
And that might not sound like a whole lot, but it's no small task when you've got that much going on at the same time. Once you get a routine going from production through engineering, I guess everything smooths out, but getting that ball rolling is a big job. So how do they do the production in the classrooms? Do the presenters, I guess faculty, operate any of the equipment for this?
No, they don't have to do anything they don't want to. Actually, the original goal of the project was to have a faculty member or a presenter be able to walk in, walk up to the lectern, and start teaching. So they don't have to do anything. There aren't any touchpanels, controls, or buttons for them to operate. They clip on a wireless microphone, they walk up to the lectern, and they can just teach and leave everything up to the classroom technician, staff members we call distance education technicians, or DETs. They're responsible for moving the cameras, controlling the audio sources, intercepting phone calls from students at the remote locations, manipulating all the graphics, filling in the content on the chroma key green screen, recording, playback, everything. Some faculty members do design their own computer graphics for the course, and they might choose to use the instructor PC at the lectern at the front of the room to advance the slides or do the webpage presentation, but they don't have to. That can be done entirely by the DETs working in the control room. [Timestamp 8:25]
And where's all the encoding gear located? Is it right there in the production studio, or is the program signal transmitted by some other means to a central location somewhere?
Well, the equipment is now located in the production classroom. Previously we had to transport the signal back to the core codecs and use fiber-optic networks to get from building to building and that kind of thing, but at almost every location now the Barracuda encoders are physically located in the production classroom, right with the equipment, and they're just connected to the LAN right there in the room. That's actually made it easy. [Timestamp 9:02]
So they're pretty much under the care and feeding of the distance education technicians?
Well, there are exceptions. At this campus, for instance, there's a larger video facility, a sort of main origination terminal, with other systems passing through: satellite systems, videoconferencing systems, a recording studio, editing rooms, and other centrally located equipment. So here we've located the encoders in the central video control center so that we can manipulate things, and occasionally we have to pass the classrooms through a captioning encoder or tie the two systems together. But generally, yes, the encoders are just sitting with the technicians in the equipment rack with all the production equipment. [Timestamp 9:48]