
Filming Video: CRTs On Set

By Cathode Ray Dude - CRD

Summary

Topics Covered

  • Genlock pots enable perfect phase
  • Film CRTs demand 48Hz sync
  • Frame rate mismatch births hacks
  • PC monitors crush TV fidelity
  • Interactivity trumps tape loops

Full Transcript

Something you might have picked up on if you've watched my channel before is that I'm fascinated with film and television. Like, if I could do it all over again, I'd want to work in those industries, even as just a stagehand or something, cuz there's a kind of magic to the art form that I've never gotten over. Even though I know how the sausage is made, and in some ways I even make it myself nowadays, my ability to suspend disbelief is still limitless. I know the sets in Star Trek are plywood, but they feel like duranium to me, and that never stops feeling cool. So, I've always been fascinated by looking behind the scenes.

It's a remarkable sensation to look at a movie set, then at the finished product, and know they're the same, even though they feel completely different. So, while I don't have the room or money to collect props, I understand the people who do. There's nothing actually exciting about owning, I don't know, a horseshoe used on the set of MASH. But if you have the right kind of brain damage, you never get tired of looking at it and thinking, "Damn, this was on that set at that moment. This is the thing that camera photographed 50 years ago. Why does this matter?" I couldn't tell you. But it is what it is. So when someone offered to send me a stack of PCs that had supposedly been used in some capacity in Hollywood, I jumped at the opportunity. Even though I didn't expect them to be anything more than ordinary computers that happened to have been used in Hollywood, I thought there might be some interesting ephemera on the drives if they were intact, but that was about it.

These are the machines in question, and at a glance, they do seem fairly ordinary. I'm pretty sure all three were built around 1994, and other than being very dirty and having gaff tape and stuff all over them, they look pretty typical for the era. They're all 486-based. Two of them are definitely just off-the-shelf Compaqs, and I think the third was a Packard Bell at one point, although I can't be sure because all three have been debadged and rebranded under the name Sparkology.

Now, Sparkology was a very small company based out of the Bay Area that hasn't been around for eons, but they aren't hard to look up since they had a website. It was run by a fellow named Marty Brenice, who I think was the sole employee. And I'm going to say upfront that he is still around and reachable, and I even got in touch with him briefly. But unfortunately, he stopped responding to me, probably because he's off enjoying his retirement. So, while I had a number of dangling questions that will come up in this video that I had hoped to get answers to, I'm not going to continue pestering the guy. If I end up getting a reply to any of my questions later, I'll put the answers in the description for future viewers. But it's not really a big deal to me because, to spoil the video a bit, these aren't even really what we're here to talk about. They're just nice conversation starters. But we will start the conversation there.

As I said, these don't look special at all at a glance. The two Compaqs in particular are pretty much bone stock.

On the front we've got the usual buttons and drives, and on the back we've got the typical ports: your keyboard, your mouse, VGA, etc. There aren't any add-on cards in either one of them. And if we fire them up, they just boot into Windows 3.1 and 95 respectively. They seem in every regard to be perfectly normal PCs. And in fact, while one of them has no software at all, the other one is full of someone's office apps that were clearly being used to run an accounting business or something. So, whatever these once did, they were later relegated to simple business machines, so they can't be that special. Now, the third machine is a bit different.

The front looks normal enough at a glance, but if you stare long enough, you might notice that the reset button isn't a button. It's a slotted screw head that rotates but doesn't go in or out. And if you turn it around, things get even stranger. Not only does the video card have these weird multi-pin plugs next to the VGA port, we also have a couple of hand-labeled switches and a set of BNC jacks labeled sync in, sync loop, and sync out. So, while the average retro computer nerd might not have noticed the other two machines in a big stack of PCs, this one would give anyone pause. And there's a good chance if you looked inside any of the three, you'd notice some other things that are amiss. The Compaqs are again largely stock. The insides look pretty much just as they should, except for this little gadget here. For the electronically uninitiated, this is a crystal oscillator, which produces a very specific reference frequency. And next to it is a potentiometer for, well, presumably tuning that crystal. This is a mysterious object. It definitely doesn't belong here. But nothing's labeled. So, even if you noticed it, you'd probably never figure out what it did. And the third machine is even more mysterious.

The inside is a rat's nest of tangled wires strewn hither and yon. The switches and jacks on the back are wired up to this little board here. There are some other wires that jump over to a handmade PCB attached to the video card. And even more wires lead to that strange screw-looking thing on the front, which turns out to be another potentiometer. And again, nothing's labeled. So, good luck figuring any of this out without the original hard drive contents, which fortunately I do have.

Boy, that's a lot of glare.

Um, worse. Ah, better.

Oh, my old red light was off this whole time. I know what I'm doing.

Oh, okay. Hey, it actually POSTed. I moved this thing from home to the studio, and every time I move it more than 10 feet, it doesn't work the next time I turn it on. So, we'll take it. There's no CMOS battery and there can't be, so I have to manually punch in the hard drive parameters. Fortunately or not, I've memorized them, which is extremely embarrassing. 520 of God's own megabytes. All right, you may ask yourself, why is the BIOS in black and white? I have no idea. It wasn't doing that until about 3 days ago, and now I can't get it to stop.

Anyway, firing the machine up, we get the usual BIOS spew, and then we should get a Windows 3.1 splash screen. Remember, this was '94, so that was still cutting edge. And then, I believe... Where is it? Where is it? Where is it?

Yeah. So, the sound card is hooked up to the PC speaker. And if you heard that, you probably understand now why that wasn't a common practice. Ow.

Anyway, once we land in Windows, we get the Program Manager for a moment. Then this gray window appears, fills up the whole screen, and then does nothing. There's a ton of options in these menus, but none are particularly self-explanatory. It's all pretty mysterious stuff. None of them appear to be the focus of the app, and there's no help file. So, things are going great so far.

Now, this is in fact all about what makes this machine special, but it won't mean much unless we discuss some of the realities of using a computer on a film set. So, let's do that. Suppose you're making a movie and you have a scene with a bunch of computers in it. You're going to want things that look convincingly like functioning computers, right? And the best solution is to just buy some computers. It's like if you have a scene set in a garage, you'll need a car to put in it. And all that has to do is look like a car, which it does by being a car. So, you just go out and buy a car. And despite it technically being a movie prop, there won't be anything special about it. I mean, unless it has to fly later. And to some extent, these machines are the same deal.

They're fairly normal devices because sometimes you just need to film a Windows desktop or some ordinary piece of commercial software. And you know, if you think about it, that's exactly what's happening right now, isn't it? This machine is doing its job. And as you'll know if you've watched my videos before, you don't need a special computer for that. Any one will do the job. By casting a spell, I've retrieved a random specimen from my warehouse, plus a monitor that more or less goes with it. And as you can see, it too is doing a perfectly good job of appearing in a movie. So, I mean, that is what I'm making here, right? No? You disagree? You say this isn't a movie, just a video. Well, that's narrow-minded of you. I mean, it's correct, but still, these terms have a lot less meaning than they used to, at least in any kind of literal dictionary sense. Originally, movie meant a moving picture shot on film, while video meant one produced electronically. But since almost everything is digital now, nobody makes that distinction anymore. It also has nothing to do with content. A movie can be fiction or nonfiction, large in scope or tiny, shown in theaters or just on streaming services. Everyone agrees that the Talking Heads production Stop Making Sense is a movie, even though it's just a fancy version of a concert cam vid. So, I don't think there's a hard and fast definition here, except for the frame rate. Just about everything made in the last 100 years that's been called a movie was shot, or at least presented, at 24 frames per second, and this isn't. I shoot my videos at 60 fps, which is one of the two agreed-upon options for North American YouTubers. The other is 30, which is much more popular, though I hate it for reasons I'll get into later.

The point, though, is that I'm not shooting this in a movie-like fashion, so nobody would ever call it that. But the thing I shoot with is a Blackmagic Pocket Cinema 6K Pro, which, as the name suggests, is meant for shooting things in a movie-like fashion. So, if I want, it can shoot at 24 frames per second. And

this is a really good moment to warn anyone with visual sensitivity issues that the rest of this video is going to be completely unwatchable. Like, I'm not kidding here, particularly if you have visually triggered epilepsy. Please close the window now, because when we zoom out, we'll find that neither of our PCs is doing such a good job anymore. A

lot of people say they don't understand why I shoot my videos at 60 fps. They think I'm out of my mind, that it's a terrible waste of bandwidth and space, and there's no benefit for the kind of work I do. And I disagree on all of those points. It's absolutely bizarre to me that people would look at a visual art form and say, "Hey, explain why you made that visual choice." I don't know, man. Probably cuz I thought it looked good. The fact is, 30 fps has always felt like staring into a strobe light for me. I find it jittery and unpleasant. And 24 fps is even worse. So, when I started my channel, I decided to impose my aesthetic preferences on you. Something no creator has ever done before, I'm sure. But also, since I started out doing a lot of videos involving CRTs, it worked out really well for that, too.

For reasons you can probably tell. Had I decided I wanted a cinematic look to my channel and gone with 24 fps, this is what TVs might have looked like. These are, if I did my math right, strobing at about 12 hertz, which I believe falls smack dab in the middle of the epilepsy danger zone. So, I was not kidding about that. But even if you aren't in literal danger, this is still pretty miserable, and it's pretty much universal. I'd be having this exact problem with any off-the-shelf PC available in 1994, regardless of configuration, and any standard television set, whether NTSC or PAL. All CRTs that you could buy on the open market would flicker like this. But what

makes the Sparkology PC special is that I can reach over, click a menu item, and just like that, the problem's solved. Now, that's a pretty neat trick, but it's so quick and clean that you got to imagine what it's doing ain't that special. And you'd be right. I had a hard time putting this script together, actually, because I kept trying to figure out how to make a whole topic out of this one machine. Well, three machines, but the other two are even less interesting. You'll find out later.

Anyway, in the end, I realized there's just not that much to it. So, here's the spoiler: all I've done is change this monitor's refresh rate. It was on 60 Hz, which is what this one is on. Now it's on 48, which matches the frame rate of my camera. And from a technological standpoint, that's a nothingburger.

It's like if you asked, "How do I charge my phone from this wall outlet? The voltage is too high," and I replied, "Use an AC adapter." How do you take out a Phillips head screw? Phillips head screwdriver. Thank you, that'll be $5. I wanted to drag it out, but it's just so simple and obvious if you have any grounding in the subject matter. And I mean, if you do know much about CRTs or old PCs, you may recognize this as an unusual feature, but it's one that takes like 2 minutes to explain: because CRTs don't draw their pictures instantaneously, and because movie cameras only take pictures intermittently, if a CRT runs at a different speed than your camera, you can get a dark or white band on the screen due to the mismatch in timing.
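The timing mismatch can be put into numbers. A camera running at some frame rate effectively samples the CRT's refresh, so the visible artifact rate is the refresh frequency aliased against the frame rate. This is my own back-of-the-envelope model, not anything from the machine itself:

```python
def beat_hz(refresh_hz: float, camera_fps: float) -> float:
    """Approximate rate of the visible band/flicker when a camera
    running at camera_fps films a CRT refreshing at refresh_hz.
    Rough aliasing model; ignores shutter time and phosphor decay."""
    folded = refresh_hz % camera_fps       # fold refresh into one sample period
    return min(folded, camera_fps - folded)

print(beat_hz(60, 24))  # 60 Hz CRT filmed at 24 fps -> 12 Hz strobe
print(beat_hz(48, 24))  # 48 Hz CRT filmed at 24 fps -> 0, the band stands still
print(beat_hz(60, 60))  # matched rates              -> 0
```

A zero result means each camera frame sees a whole number of refreshes, which is exactly why dropping the monitor to 48 Hz, twice 24, makes the artifact vanish.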

And while just about any PC from the early '90s onward could output video at variable refresh rates, the universal default was 60 Hz, and by the '90s, all video modes with lower frequencies were defunct. So, while it was technically possible to reduce your refresh rate, no off-the-shelf graphics card was ever designed to generate a 48 Hz signal. So, the aftermarket crystal oscillators added to these machines provide the necessary reference frequency. There,

that's all there is to it. You see, there's just not that much to say about this, technically speaking. And honestly, if all you want to see is how this machine was used, you can skip to the last chapter. I'll do a few more demos, show you some of the neat stuff on the hard drive. But the thing is, to me, while that's kind of cool, it's mostly just a tour of how one long-defunct company handled on-set graphics once upon a time. The solution and what it was used for are neat, but all the factors that led up to that solution are more interesting to think about, in my opinion, particularly if you don't already possess the fundamentals. So, I decided to just go over all of it, basically, like, from the beginning. And

to be clear, this is not a master class. I don't really know what I'm talking about from experience or education. I never went to film school. I had to Google how camera shutters worked. And I've never been in the same room as a film movie camera, the topic of this whole video. So, almost the whole thing is theory and hypothesis and conjecture, but I think I got it mostly right, and what's not is speculation that I don't think anyone can answer for sure. So, I decided to just go for it. I expect to get some corrections, which I'll put in a pinned comment, but I can't get corrected unless I'm wrong first. So, let's make some breathing room and get to it.

I'd guess CRT flicker is a

familiar sight to a larger-than-average portion of my audience, at least compared to most channels. Anyone who's pointed a smartphone or camcorder at a picture tube has probably encountered it. And some of you know how to fix it, while for others, it's just an irritating problem, but it's probably a mystery to very few of you. Cathode ray tubes are pretty widely understood at this point. I mean, if you can explain cathode ray dudes, let me know. I have a lot of questions. But I think everyone watching probably knows the basics of a picture tube. Still, for the uninitiated, it's like this. CRTs

consist of a glass envelope containing an electron emitter, or gun, at one end and a rectangle of phosphor at the other. The gun produces a stream of electrons which are steered by magnetic coils, and wherever the beam strikes the screen, the phosphor fluoresces, producing a dot of light. And to be clear, this is literally a dot. In this high-speed footage, courtesy of the Slow Mo Guys, you can see that it really does illuminate just one tiny spot.

However, that spot moves really fast.

This video is shot at 380,000 fps, so this freeze frame represents 1/380,000th of a second. If we move down to 2,500 fps, or 1/2,500th of a second, now it looks like there's a whole line illuminated.
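Those fractions check out if you model the beam as sweeping the whole screen once per refresh, so the lit portion is just the exposure time divided by the refresh period. A rough sketch assuming a ~60 Hz scan and ignoring phosphor persistence:

```python
def fraction_lit(exposure_s: float, refresh_hz: float = 60.0) -> float:
    """Rough fraction of the screen the beam traces during one exposure."""
    return min(exposure_s * refresh_hz, 1.0)

print(fraction_lit(1 / 380_000))  # ~0.00016: a single dot
print(fraction_lit(1 / 2_500))    # ~0.024: a thin sliver of lines
print(fraction_lit(1 / 240))      # ~0.25: about a quarter of the screen
print(fraction_lit(1 / 30))       # 1.0: the shutter spans more than a full refresh
```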

At 240 fps, it looks like about a quarter of the screen is lit up. And to my studio camera, it just looks like a solid image. Now clearly, as we can see, the screen never really is all lit up at once. But two effects produce the illusion. First, the phosphor has a certain amount of persistence. It takes time to stop glowing after the beam strikes it, though less than you might think. And second, our vision has a certain amount of persistence, known as persistence of vision. While we don't see the world at a specific frame rate, there is a limit to how fast our visual

system can react. So, it tends to blur things together if they change quickly enough, both literally in the sense of motion blur, but also cognitively. If we

see a series of largely identical images with just incremental changes, we perceive it as continuous motion due to, essentially, a neurological version of the Kuleshov effect. Film students

know this one very well. If you show someone several disjointed scenes, but there's some way to explain how one logically follows the other, then it feels like it's all one sequence. It's

why movies work at all. Likewise, when

we see something in one place and a moment later it's in another, our brains don't go, "Oh, no idea what that was about." Instead, they perform this astonishingly powerful act of analog computing, working the situation in reverse to figure out what must have happened and then reporting that to us as fact. This is the underpinning of many optical illusions, but the motion picture is chief among them. When you

display a series of photos, we perceive it as continuous motion. And while there have been disagreements over how slow you can go before the illusion stops working, we usually don't need to worry

about that because it's somewhere south of 24 frames per second. That was the number chosen by the film industry back in the 1920s as the lowest film speed that could support sound. And while we

no longer have that limitation, pretty much nobody is interested in making movies slower than that anymore.

However, many people make movies that go much faster. Your typical 20th-century American television, for instance, ran at nearly 60 frames per second. We'll come back to that "nearly"; it does a lot of lifting here, but it's close enough. Also, since analog TVs all used interlaced video, there's a complicated thing going on involving fields and frames, which I'll explain later, but it doesn't matter at the moment. What's important is a standard American television produces a recognizable image about once every 60th of a second.

That's well above the speed our eyes demand, so it looks like a moving picture to us, and it can to a camera as well, since they have their own persistence of vision. Back in the days of film, when a camera took a photo, it opened its shutter, the opaque cover over the film, for a certain length of time, during which light collected on the film and built up a chemical impression. Now, that time interval was critical: leaving the shutter open for longer captured more light and made a brighter image, but any moving objects would become blurry the longer it was open. So, to get the sharpest picture, you generally wanted the fastest speed that lighting conditions permitted. Now,

modern cameras are all digital, and their sensors can be electrically erased and then read out again very rapidly. So, they don't technically need shutters, and many cameras don't have them. Your phone is a good example. It just wipes the camera sensor, waits a moment for an image to form, then offloads the contents, and there you go, a photo. The trouble is, this process isn't instantaneous, and that causes a phenomenon called rolling shutter, where the image can get skewed if things were moving very quickly when you took it. Now, high-end digital cameras often still have this problem, and they resolve it by continuing to use actual physical shutters. Most mirrorless cameras, for instance, still have them. Generally speaking, however, modern motion picture cameras don't have actual physical shutters. They have solved this by adopting something called a global shutter at the sensor level, which I don't understand, but it doesn't have this problem. All the same, the concept is still relevant. The camera still has to wait a specific length of time after clearing the sensor to allow an image to build up again before digitizing it. And for the sake of convenience, we still call that the shutter speed. Now, since

a CRT builds up an image dot by dot, line by line, if you leave your camera's shutter open long enough, all those dots and lines accumulate into a complete picture. But you have to leave it open for exactly the right length of time, no more and no less. Fortunately, most modern cameras let you adjust this time period. The Blackmagic that you're watching me through right now is recording at 60 frames per second. And since I like having a little bit of motion blur, I've left the shutter speed maxed out. Shooting at 60 fps means the camera has to take a picture once every 60th of a second. So that's the slowest possible speed, 1/60th.
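In other words, the only hard constraint is that the exposure has to fit inside one frame period. A trivial sketch of that rule (my own illustration, not anything camera-specific):

```python
def max_shutter_s(fps: float) -> float:
    """Slowest possible shutter at a given frame rate: one full frame period."""
    return 1.0 / fps

def shutter_fits(shutter_s: float, fps: float) -> bool:
    """True if the exposure fits inside one frame period."""
    return shutter_s <= max_shutter_s(fps)

print(max_shutter_s(60))          # ~0.0167 s, i.e. 1/60th, at 60 fps
print(shutter_fits(1 / 120, 60))  # True: faster shutters always fit
print(shutter_fits(1 / 30, 60))   # False: 1/30 s can't fit in a 60 fps frame
```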

Now, I can decrease that if I want to.

If I want my motion to be sharper, I can adjust the shutter speed to more like 1/120th of a second, or even smaller values, down to like a thousandth. So if I was recording something moving at really high speeds, that would let me get clean freeze frames. I would just have to make my iris bigger, turn up my ISO, maybe push the exposure in post to make it as bright as I want it to be. Generally speaking, though, I prefer 1/60th. Not only do I think it looks good for my usual talking-head and show-and-tell scenes, it also captures the most light, and it matches the speed of most CRT displays found on this continent. So when I have, say, a normal TV set on here, it usually looks terrific, because it's running at the same speed as the camera.

I mean, okay. So, NTSC televisions, the American sort, used a refresh rate of 59.94 hertz. The engineers wanted it to be 60, and in the black-and-white days, it was, but due to some complicated science problems, when they added color to the system, they had to reduce the frame rate just a tiny bit. So, my camera is actually recording 0.06 frames per second faster than the TV is displaying, but that's usually close enough. In practice, every time this camera takes a picture, it leaves the shutter open long enough for the TV to produce a complete image, more or less. In reality, if we look

closely at the TV, we'll occasionally see this little dim band scroll up the screen. It should be a straight line, but due to the weirdness of rolling shutter in the digital camera, it ends up having this weird curvature to it. If this was straight onto the camera, it wouldn't look like that. But the point is, it's not perfect. There is a little tiny gap where the electron beam hadn't quite made it all the way down to the beginning of the previous frame. So, you are seeing 0.06 frames' worth of stale phosphor from the last time it updated the screen. Now, in its heyday, this never would have come up with any kind of video camera, because back then they all shot at the same 59.94.
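The 59.94 figure and the 0.06 gap are easy to verify: NTSC color scaled the original 60 Hz rate by exactly 1000/1001 (that factor is the standard NTSC relationship, not something stated in the video), and the leftover difference is how fast the dim band cycles:

```python
from fractions import Fraction

ntsc_hz = Fraction(60) * Fraction(1000, 1001)  # NTSC color field rate
camera_fps = Fraction(60)                      # the camera's clean 60

beat = camera_fps - ntsc_hz                    # mismatch between the two
print(float(ntsc_hz))   # 59.94005994...
print(float(beat))      # ~0.0599 Hz: the "0.06 frames per second"
print(float(1 / beat))  # ~16.7 s for the band to lap the screen once
```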

Nowadays, we have 60 fps cameras, and we don't have to use them that way. I could adjust this thing to shoot at 59.94, but I don't have to do that weird math for anything else I do. So, I just decided to go with a clean 60. I mean, it fits the monitor you're using anyway. You're probably on a 60 Hz TV or PC monitor.
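And since 59.94 and 60 aren't integer multiples, a 60 Hz display showing 59.94 fps material has to show some frame twice every so often. A tiny simulation of that cadence (my own sketch, not something described in the video):

```python
# Display refresh k happens at k/60 s; the newest 59.94 fps source frame
# available then is floor((k/60) * 60000/1001) = floor(k * 1000 / 1001).
shown = [k * 1000 // 1001 for k in range(1001)]  # ~16.7 s of refreshes

# Count refreshes that had to repeat the previous frame.
repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(repeats)  # 1: one duplicated frame roughly every 16.7 seconds
```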

So, your computer would have to convert between frame rates if I did that. So, for the vast majority of what I do, a clean 60 makes sense. And the only downside is I get this little artifact when I look at CRTs, which I don't do as often as you'd think for my channel. And also, I think it looks great, so I don't care to do anything about it. And the thing is, even if my

tastes allowed me to shoot at 30 fps, that could also work pretty well. At this frame rate, the upper shutter speed limit is 1/30th of a second, and technically that's too long. It leaves the shutter open long enough for the TV to draw two pictures, which get blended together, and we'll come back to that fact much later. Basically, this is going to look better in some cases than others. If you have high motion on the screen, it might be kind of unpleasant. But for the most part, for most sources, it's going to look all right. And if I am concerned, if I do want a cleaner image, I do have the option of just adjusting the shutter. Since a 60th of a second is smaller than a 30th, the camera can shoot at that speed. And it looks a little weird with all these black lines here, which we'll discuss later, but in most cases, this will be perfectly usable as well. So all this stuff works pretty well together, but that's only because of the close relationship

between said stuff. After all, YouTube is a video website, I'm shooting a video, and that is a video camera, despite its cinematic aspirations. So all this technology has the same roots. We use 30 and 60 fps, at least in North America, because that's what the TV industry wanted in the first place. When the digital age lifted the limitations they faced, we quickly adopted a purer version of the product, which made a lot of things simpler. But this 2020s camera and this '90s television still share so much DNA that one can pretty easily work with the other. And of course, this all happened because of the incremental march of technology. While there are many reasons that we never changed to some totally new frame rate like 80 or 100, the biggest is that there was just never any opportunity. We never said, "All right, time to start from scratch." 60 fps was selected back in the black-and-white days. So when we went color, the signal had to work with the old monochrome sets, and we chose a frequency close enough to what they'd been using that it would still work. Then when the first digital formats were developed, those had to work on the color and black-and-white sets that people already had. So they used all the same frame rates. And when HD came along, it may have delivered higher resolutions, but it still had to be easy to broadcast existing standard-def content and convert the HD stuff to work with the old TVs, even the black-and-white ones, and so on and so forth. Even if there were compelling reasons to choose a totally new standard, it was never going to happen because of the continuous lineage that runs through all video technology, none of which overlaps with film.

Although video technology got a lot better over the course of the 20th century, even by the 2000s, the vast majority of remotely respectable films were still filmed on film. The motion,

color response, light sensitivity, resolving power, and general mouth feel was so much better that it didn't really get replaced until the late 2000s, even for the crappiest of screenplays, at

least in the US. Uh elsewhere,

particularly the UK, a lot more stuff was shot on tape. uh perhaps most famously the entire run of Doctor Who which I think never switched to an all film process at any point. Uh in the 60s

and 70s they were using video for indoor shots and film for location shoots and then when they rebooted the series I think they went to all video uh inwards and out although at that point like 2005 that was becoming more acceptable even

in the states. Uh in any case though I've never dug deep enough to find out why video was so popular in the UK. I

realized there were financial issues to consider, but I mean the US had super low-budget projects as well, and still everything was shot on film, no matter how cheap the production, unless it was

like a sitcom. And even then, a surprising number of those were also shot on film. And this turned out to be a pretty rad decision in the fullness of time since we now have HD releases of Star Trek, uh, The X-Files, and

Frasier. But I have to admit, there is

a real neat look to those dramas that were shot directly in PAL video. I

mean, sure, 1970s Who had comet trails all over the place from the video tube cameras, but Rumpole looked like a million bucks. And seeing Leo McKern in

50 FPS is so worth it that I actually picked up the whole DVD box set.

In any case, though, almost everything that could be called a movie was shot on film, and that produced more problems than you might think. Something I

suspect most people don't realize is that film and television were pretty much invented in parallel. One wasn't

really based on the other and the two industries didn't interact much for their first couple decades. So by the time anyone thought of broadcasting a film on TV or shooting film of a TV screen, the two technologies had

diverged most obviously in terms of frame rate. Uh TV had gone with 60 fps

while film had settled on 24, uh for

lots of reasons. Those were set in stone a hundred years ago. And ever since and to the present day, it's been a big dividing wall between the two worlds because you can't cleanly convert

between frame rates that aren't multiples of one another. Uh you see, if you have 30 fps video and you want to show it at 60, that's no problem. You

just show every frame twice. And if you want to do it the other way around, that's also easy. You just skip every other frame. But how do you convert a

24 fps film for a 60 fps display? Well,

the closest multiple is 30. But even

that doesn't divide cleanly. So you have to do a thing called 3:2 pulldown, where you display every odd frame three times and every even frame twice. This

produces an effect called judder, which we've just been putting up with for a century because it's tolerable. And

there's also some nonsense going on there with frame blending. It keeps it from looking its worst, but it still sucks. And

things are a lot worse when you're trying to do it the other way around.
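The 3:2 cadence just described is easy to make concrete. Here's a little sketch in Python (my own illustration, nothing from the video):

```python
# A sketch of 3:2 pulldown: odd-numbered film frames are shown three
# times and even-numbered ones twice, so 24 film frames per second
# fill 60 video frames per second.

def pulldown_32(film_frames):
    out = []
    for i, frame in enumerate(film_frames):
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out

# Four film frames become ten video frames (so 24 become 60):
print(pulldown_32(["A", "B", "C", "D"]))
# → ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

That uneven 3, 2, 3, 2 cadence is exactly what shows up on screen as judder.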

See, film has the decency of being physical. Each frame is a literal

photograph. So, you can point a camera

at it, or even project it and point a camera at that. But analog video is very ephemeral. Uh, prior to the age of LCDs,

each frame of video could only be viewed as a momentary flash of phosphorescence on the face of a picture tube. And there

was no easy way to make that flash look good on a movie camera. Let's dig into that. So, here's our TV again, and it

still looks pretty good at 60 fps. But

when we switch to 24, it looks terrible, much like our PC did earlier. Not only

are we getting that unpleasant bright band flickering on the screen, but the picture is also uh very ghosty. And this

isn't even a worst case scenario. So, uh

let's make one. Now, we're looking at a frame counter. Every time the TV draws a

new picture, this number goes up from 1 to 60 and back again. And this would look fine at 60 fps, if a little flickery. See, it's not pleasant to look

at, but it's tolerable. Here, however,

it's a disaster because of the irreconcilable differences in timing.

Uh, you see, my camera's shutter speed is currently set to 1/48th of a second.
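A quick bit of arithmetic (my own sketch, with a made-up helper name) shows what a 1/48 s exposure means against a 60 Hz screen:

```python
# Vertical scans a TV completes during a 1/denominator second exposure
# (a made-up helper for illustration, not anything from the video).

def scans_per_exposure(refresh_hz, shutter_denominator):
    return refresh_hz / shutter_denominator

# A 60 Hz set during a 1/48 s exposure:
print(scans_per_exposure(60, 48))  # → 1.25
```

That extra quarter of a scan beyond a whole frame is the part of the screen that gets swept twice.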

And we'll talk later about why I chose that speed, but in short, it's a very common one for movie cameras and often the only option. So, in a 1/48th of a second interval, the TV has time to draw

one full image, then come around and draw a quarter of the next one before the camera's shutter closes. And this

has two effects that we can see if we slow this footage down. Uh, first, a quarter of the screen is twice as bright as the rest because the electron beam swept over it two times. The same thing

that was making our computer monitor flicker earlier. But in that case, the

image was static. So all you saw was the bright band where it was getting overdrawn with the same picture. Here we

also get a confusing result because we're seeing one quarter of a very different picture stacked on top of the last one. So that should make the

problem clear enough. Now how did Hollywood deal with it? Naturally, there

were a few solutions. And one of the popular ones up through the 70s was to just not solve the problem at all. Uh

for instance, near the beginning of Dawn of the Dead, the studio monitors are just permitted to flicker. Now, this is partly a good thing. Uh this shot is very short, so it's not too unpleasant, but it's also meant to establish that

we're in a TV studio, and the flickering enhances that. But they also kind of had

no choice because the film was made on a budget of whatever George Romero had in his pocket. And all this gear that you

see here is the real thing, from the Tektronix program monitors to the Chyron titler. I'm sure he just went

into a local TV station and used what was there. But lest you think this is

just because George Romero shot this thing on 40 bucks: the 1976 movie Network, uh, produced for almost $4

million does the same thing. Virtually

every tube in the whole film flickers at all times and it looks fine for the most part. So, this was old school solution

number one, and a very popular choice.

Option number two, however, can be seen near the end of Dawn of the Dead where Steven's watching TV, but when we get the reverse angle, it's clearly fake.

Uh, the picture is flat. There's no

reflection from the tube, and you can see the edges of the image jittering.

This is a technique called a burn-in, and you see it all throughout the 60s and 70s, especially in TV shows. Uh, for

instance, in the Columbo episode Playback, the security monitors seen throughout the episode are genuine TVs, and that makes sense since they're the stars of the show and a pivotal plot point. But the football game Columbo

sees at a bar, which gives him the idea to wrap up the case, that's another burn-in. So, the way this works is they

basically take a single frame of film showing a prop TV, then a film strip containing footage they want to insert, and put them together in an optical

printer, which projects them both onto a new film strip. Then they insert a mask or matte with a cutout of the TV screen, creating a hole in the still image through which the underlying footage can

be seen. Now, this is actually a lot of

the same steps as the traveling matte process, which we call chroma key these days. And even though it's much less

complex, especially when working with actual film, it still involves an optical printer. So, I have to assume

it's not exactly a cheap process. And it

doesn't look great. Um, the picture is very unnatural looking. There's jitter

around the edges, caused by the destination strip not being held perfectly still relative to the matte. And

you can actually see the uneven line where they cut the matte out by hand in the upper right corner. And all this wouldn't be so bad for Columbo, which was shown on standard-def TVs, but it

would have been noticeably ugly in a feature film. So, I suspect this was

usually used out of convenience, uh, and when other techniques weren't available.

And sure enough, uh, while this method actually hung on for a very long time, you do tend to see it only in lower budget productions in later years. Uh,

if we look at the music video for Fatboy Slim's 2001 track, Push the Tempo, for instance, the TV at the beginning is also clearly a burn-in, albeit one done in Final Cut Pro, and it doesn't look

great, but hey, it's a brief shot in a music video. Who cares? Likewise, a ton

of productions nowadays just add computer screens in post. And sometimes

it doesn't look so great, but it gets done often enough. So clearly,

filmmakers throughout the ages have often agreed with the adage that real winners quit. But there were several

proper solutions. The first one was very

simple. You just filmed a totally

ordinary TV with a totally ordinary camera, because it turns out that can work, at least to a point. Uh you'll

recall me saying earlier that since a 1/60th fits inside of a 1/30th, you can get a clean image when shooting at 30 fps by just shrinking the shutter interval. And

obviously the same is true at 24 fps, cuz this is that. I've turned my camera's shutter down to a 1/60th of a second. And

now we have a good-looking picture. Uh so

this seems like a pretty good solution, but there are still some problems, not all of which I fully understand. Uh

obviously compared to the test we did earlier with a 1/48 shutter, this is a huge improvement. We no longer have

overlapping images. So, it's a much

cleaner looking picture. But if we uh flip over to the frame counter here again, it looks pretty good. But if we slow it down, you can see that it's not

actually as clean as it looked at first glance. Uh we don't have overlap

anymore, but each frame is still a combination of two pictures. Uh there's

a split in the middle caused by the lack of sync between the camera and display.

Uh these may be running at the same speed, but they're out of phase. So, the

camera's shutter keeps opening when the TV is in the middle of one frame and closing as it's partway through the next. Uh, and worse, as we step forward,

that split point keeps drifting upwards.

So, we do still have problems. And what's going on here might not be entirely clear. So, let me give you a

little illustration. Regardless of

shutter speed, the camera has to take a picture once every 1/24th of a second. And

once it starts running, it has to hit that cadence with absolute consistency.

It can spend more or less time taking each picture, but it has to open the shutter at the same point within each interval. And once it closes, it can't

open it again until the next 1/24th of a second comes along. Otherwise, the

resulting motion will look unnatural.
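To see what that strict cadence does against a TV's fixed refresh, here's a toy simulation (mine, not from the video):

```python
# Where the TV's raster happens to be (0.0 = top of screen) each time
# a 24 fps camera's shutter opens on a 60 Hz set. A toy model for
# illustration only.

def scan_phase_per_frame(fps, refresh_hz, frames=6):
    return [(n * refresh_hz / fps) % 1.0 for n in range(frames)]

print(scan_phase_per_frame(24, 60))
# → [0.0, 0.5, 0.0, 0.5, 0.0, 0.5]
```

At exactly 60 Hz the phase just bounces between two values; at NTSC's true 59.94 Hz it also creeps a little every frame, which is the slow upward drift of the split point seen in the demo.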

Meanwhile, the TV has to generate a picture once every 1/60th of a second. And

that means the phase between the two is constantly drifting. The camera may land

on a clean frame occasionally, but mostly you'll get what gamers call tearing, where each frame contains a fraction of the previous one. Now, this

might not matter that much, especially if there's a relatively static image on the screen. You're not going to see the

split in that case cuz nothing's really changing, but you're still going to have the uh black bar that we saw earlier during the uh 60 fps tests. And this

artifact was well known back in the film days, uh often referred to as a roll bar or scan bar. And while we'd certainly prefer to get rid of it, I get the strong impression that this isn't nearly

as bad as it used to be. Uh, and for some clarity on that, let's talk about how movie camera shutters used to work.

Here's yet another clip from the Slow Mo Guys where we see a partly disassembled 16mm movie camera. And while this is a 1970s consumer model from Russia, it works pretty much like every other one.

That rotating disc there is the shutter.

So as that spins, it covers and uncovers the film gate, which exposes the film for a fixed length of time determined by the size of the gaps in the disc. And

this is why if you ever read about film making, you'll often see shutter speeds described in terms of angle. An enormous

number of cameras have a shutter that looks like this, a half moonshaped disc.

This is called a 180° shutter because the plate covers exactly half the disc, 180°. And when you spin this at

24 revolutions per second, it exposes each film frame for 1/48th of a second, exactly half the interval. As far as I can tell, this is

considered the default shutter angle for filmmaking. There are many cameras,

especially lower-end models, that can only deliver that speed. But I get the impression that it's generally assumed that you're using a 180° shutter unless you have some reason not to. I'm

embarrassed to admit that I don't really know why offhand. I haven't really seen anybody explain why that's the number, but it seems very common. And

like I said, in some cases, it's literally the only option, which is why I've been sticking to that shutter speed for all the uh 24 fps tests in this video. Now, in the camera that Gavin was

demoing, they used another approach where the shutter has two blades and spins at 12 revolutions per second. I believe this is referred to as a bow tie shutter. And

I'm not sure why and when these are used, but it seems to be a tossup which one you'll get. Some cameras have one type, some have the other, but it doesn't really matter, because if you add up the combined angle of the two gaps,

you'll get an equivalent number. So, a

180 degree bow tie would thus have two 90° gaps. Now, if we look closely at

Gavin's example, uh you'll see those gaps don't add up to 180. They're closer

to 72 because this is a 144° shutter. Uh

that produces a shutter speed of about a 1/60th of a second, which actually means this camera would cope pretty well with CRTs, though, ironically, not the ones

in the region it was originally sold in.
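The angle-to-speed conversion behind all of these numbers is just a ratio. A minimal sketch (mine, not from the video):

```python
# Exposure per frame for a rotary shutter: the gap occupies angle/360
# of the disc, and the disc makes one full turn per frame.
# My illustration only; the helper name is made up.

def exposure_seconds(shutter_angle_deg, fps):
    return (shutter_angle_deg / 360.0) / fps

print(round(1 / exposure_seconds(180, 24)))  # → 48, i.e. a 1/48 s exposure
print(round(1 / exposure_seconds(144, 24)))  # → 60, i.e. a 1/60 s exposure
```

Which is why a 144° shutter at 24 fps happens to line up so neatly with a 60 Hz screen.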

Yeah, I don't know. Anyway, this is how all cinema cameras worked, big or small.

They all used rotary shutters, very much like this one. And as I said, some models were fixed at one angle, typically 180°. But from what I've read,

the majority of serious professional 35mm cameras offered variable shutter angle, even by the early 50s. Now, the

way this worked is fascinating in its simplicity. The shutter simply had a

second disc attached underneath the first, and by rotating one relative to the other, you could shrink the size of the gap, thus reducing the exposure time. And this seems like it should just

solve the problem, right? Just

reduce your shutter to match the TV and you should be golden, assuming you don't care about a little bit of image tearing. But apparently, it's not so

simple. And as with many other technical

aspects of this story, I am not 100% sure why, but let's go back and take a look at that clip from Network again. I

quipped earlier that they basically just shot these TVs all natural and didn't bother doing anything about the flicker problem, but I don't think that's actually true. I mean, sure, we see the

roll bar, but if we compare to Dawn of the Dead, those roll bars are much wider. They look to be about, well, the

size we saw on my monitor earlier, like a quarter of the screen. So, I'd guess that Romero, like me, was shooting his scene with a 180° shutter, though I couldn't tell you exactly why. It's

possible his camera wasn't adjustable, but it could also be the case that he didn't have enough lighting for a smaller shutter angle, uh, since shorter exposures produce a dimmer picture. In

any case, however, it's clear that they were using a shorter shutter angle in Network, since the bar is so much narrower, yet it's still bigger than what we see on my camera, and it's bright rather than dark, which tells us

the camera was running a bit slower than the TV instead of faster. And maybe this means they had to use, say, a 160° shutter due to some other aspect of the

scene, like lighting considerations. or

it might mean the camera's shutter wasn't very reliable, which sounds weird, but I have supporting evidence for it. I found this article in a 1979

issue of American Cinematographer that's just jam-packed with fascinating little tidbits. Uh it's written by an engineer

named James Mandraa, and he's talking about solving this exact issue when filming the movie City on Fire, uh for a scene where he had, I think, 14 TVs in

one shot. And this guy sounds like he's either living in 3026 or is a space alien, because he goes through all kinds of potential solutions and I only understand some of them. Uh first off,

he says they considered using a burn-in for the TVs, but he felt it was impractical because they had 14 in one shot. And I thought this was an

interesting comment because he doesn't explain why. Um thinking about it

though, I guess you'd have to do 14 passes through the optical printer to add each shot one at a time, uh which

sounds like a huge error-prone pain in the ass, and also you'd probably lose quality on each generation. So that

was part of the reason there. Um next up though he gives a remarkable amount of information about NTSC timing including the fact that it runs at 59.94

and not exactly 60 Hz. But then he goes on to state that had he been working with just one TV, he could have modified the uh VTR, the video tape recorder, to play the tape faster, which would drive

the TV at exactly 60 Hz and then he could have just shot it straight.
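Writing the NTSC numbers he cites as exact fractions (my own sketch, not anything from the article) shows how tidily the rates relate:

```python
from fractions import Fraction

# NTSC's vertical rate is exactly 60 * 1000/1001 Hz, the color-era
# adjustment that gives the familiar 59.94 figure.
field_rate = Fraction(60000, 1001)

# A film camera slowed by the same 1000/1001 ratio stays locked to
# NTSC timing: the equally familiar 23.976 fps.
camera_rate = Fraction(24000, 1001)

print(float(field_rate))         # ≈ 59.94
print(float(camera_rate))        # ≈ 23.976
print(field_rate / camera_rate)  # → 5/2, exactly 2.5 scans per film frame
```

Speeding the tape up to a flat 60 Hz, as he describes, erases that 1000/1001 factor and leaves plain round numbers to sync against.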

You might notice that he doesn't actually say why that would solve the problem. I have to assume he means

that if he set his shutter angle to 144 degrees, then he'd get a clean image, because if that's exactly a 1/60th and the TV is at exactly 60 Hz, then you shouldn't get any roll bar. He just doesn't actually say

that, so I'm not sure it's what he meant. But either way, he couldn't use

that solution, and I figure there's lots of productions where it wouldn't be practical. So, let's move on. Uh, the


next thing he points out is that camera shutters don't open and close instantaneously.

This is an interesting point. Uh, as we established, NTSC video is very close to 60 Hz. So, you'd think if your shutter

60 Hz. So, you'd think if your shutter opening was calculated for exactly 1/60th of a second, you'd get what you've seen on my video here, just the tiniest little roll bar. That might not

even be worth fixing, especially if you're shooting from a few feet away.

But I think the reason it's so small in my shoots is because my camera does have an instantaneous shutter. It starts and

stops recording in precisely 1/60.0000 of a second, no more, no less. But with

a mechanical shutter, you get a little bit of extra exposure on either side of that as the shutter is swinging into and out of the frame. That would result in overlap between the scans and a bright

roll bar on the screen. To solve this, you need a shutter opening that's just shy of 144°. So that's what he did. He

went out and got a custom shutter made in a machine shop with a precise 1594th opening. He describes it as equivalent

to 144° but not actually 144°. Then it isn't! Who talks like that?

Anyway, this all makes sense uh sort of.
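His reasoning can be sanity-checked with the same shutter arithmetic as before (my own sketch; the helper name is made up). An exposure covering exactly one vertical scan sweeps every line once, so there's no bar:

```python
# Vertical scans completed during one rotary-shutter exposure
# (my illustration). An integer result means the whole screen gets
# swept uniformly: no roll bar.

def scans_during_exposure(shutter_angle_deg, fps, scan_hz):
    return shutter_angle_deg * scan_hz / (360.0 * fps)

print(scans_during_exposure(180, 24, 60))                   # → 1.25: the quarter-screen band
print(round(scans_during_exposure(144, 24, 59.94), 4))      # ≈ 0.999: a slowly crawling bar
print(round(scans_during_exposure(144, 23.976, 59.94), 6))  # → 1.0: locked, no crawl
```

Which is consistent with his point: against real 59.94 Hz sets, a nominal 144° at a flat 24 fps is almost but not quite clean, so the blades' travel time pushed him toward an opening just shy of 144°.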

But you might be wondering um if that's all it takes and assuming you're working on a big budget feature and thus definitely have a nice high-end camera with all the features, why not just tweak the adjustable shutter to that

precise value instead of replacing it wholesale? Well, according to James,

it's because adjustable shutters didn't actually have reliable timing. He says

that the two parts of the disc jittered relative to one another, so that in practice, the scan bar would flip between being too bright and too dark on successive frames. Now, this sounds

like bunk. I can't imagine a $50,000

camera having that much slop in its workings. But hey, look, he was

president and technical supervisor of Sonics International Corp., Burbank, California. I'm just some guy, so we'll

take his word for it. And to be fair, from what I have learned about adjustable shutters, it might have been pretty hard to tune one that precisely in the first place. Uh, for instance,

the manual for the RE435 says that you could adjust the shutter from 11.2° to 180°, but goes on to say that it only locks at specific preset

positions. Now, I'm not sure if this

means that those are the angles it'll slot in at reliably, so you get a precise value without this jitter issue, or if it literally can't be used at in-between

positions. But either way, it sounds

like you couldn't dial in a weird fractional angle and trust it. And this

camera was developed in the '90s, so I imagine things were the same or worse in past decades. And indeed, I think this

might be what was going on in Network:

They had adjusted their camera, but this was as close as they could get it.

Assuming all this is true, if you wanted your shutter to perfectly match standard TV timing, then the only solution was to get a custom one, which most productions obviously couldn't do. But there was a

trick that let you fake it. Uh, by the 80s, and maybe a little earlier, film cameras started including crystal motor drives, uh, that is, a drive mechanism

that was regulated by a quartz oscillator and could run at any desired speed with a very high level of precision. And if you had one of those,

then you could configure it to run just a tiny bit slower than it should, at 23.976 FPS. And this doesn't actually solve the

timing issue. There'll still be a scan

bar, but you remember how in my demos the bar slowly crawled up the screen?

Well, apparently if you run your camera at this exact speed, the bar will appear, but it won't move. So then all you have to do is adjust the phase of the shutter until the bar rolls off the bottom of the screen and into the

overscan area where nobody can see it.

This was apparently the budget solution for decades. I've seen references to

doing this all over the place, but it sounds like there were some limitations to it. Uh, for one thing, it stuck you

with a very specific shutter speed, and there were plenty of reasons you might want to use a different one. Not the

least of which that you may be working with HMI lighting. HMIs are a type of arc lamp commonly used on movie sets. I

think still to this day, and they basically function as incredibly bright, very high-speed strobe lights. Uh that

means they flicker at speeds that are hard for humans to see, but which can show up on film. From what I've read, HMIs are supposed to work at a range of shutter speeds, but they use very weird

and very large bulbs. And apparently

over their lifespan, the output curve can change. Uh likewise, if your local power conditions are weird, that can also cause odd ripples in the lighting. So the safe bet, per one blog I

read, is to always shoot with a 180° shutter, because that'll reliably capture the whole light pulse. Now this seems like it should be a very solved problem.

But apparently whoever was doing the photography on Star Trek: The Next Generation season 1 either didn't know this trick or had some other constraint that forced them to use shorter shutter angles, because all over the place

throughout the first season of the show there are scenes with just horrible flicker going on in the background. So I don't know exactly what happened here, but I would guess that he set his shutter to compensate for one problem and didn't

realize that it was causing another. And

this is the trouble. If you set your shutter and your film speed to work with a normal TV, you might do all your shooting for a day, feel like you got it, and then find out in dailies that none of the footage is usable because the lights were flickering. And even if

you didn't have that constraint, the process of phasing a camera to match a TV could be pretty miserable in itself.

Uh those crystal drives did make it a lot easier because you could inch the shutter forward bit by bit. Uh shifting

its rotational phase relative to, I guess, local power frequency would have been the reference. And then the timing

circuitry would remember the offset. So,

as long as you didn't shut off the camera or the TV, once you had the two in phase, you were good to go. But you

still had to get it dialed in in the first place. And that could be a real

pain, at least in the early days. Here's

another article I found from the same magazine, but written by Victor Keer, the DP on Hot to Trot, an atrocious movie from 1988, in which Bobcat Goldthwait inherits a talking horse voiced by John Candy.

>> I'll go get you a beverage.

>> Yeah, something diet, please.

Okay, Fred, you're the boss.

>> Yeah, I am the boss.

>> This movie is apparently an absolute train wreck. Uh, but I've only watched

the scene where Bobcat and the horse are hanging out in his living room watching TV on a hideous front projection television.

>> Yeah, I almost got married. I was living with this hot blooded Arabian. You know,

>> you lived with somebody? No way.

>> Oh, I loved it and I hated it.

>> What you love about it?

>> Everything. Mhm. Mhm. Well, then what you hate about it?

>> Everything else.

>> So, I wake up. I'm butt naked.

Everybody's looking at me.

>> I tell you, that's the last time I ever drank tequila.

Keer says this scene was fiendishly hard to shoot because they had to illuminate it very carefully. They needed to pour

light onto the dark brown horse so he wouldn't just be a silhouette, while simultaneously making Bobcat and the rest of the room look natural. And of

course, everything had to revolve around the TV. Since it was a genuine

off-the-shelf consumer model, they were forced to use a 144° shutter, which limited their options for exposure control. But beyond that, the process of

syncing the camera and TV was pretty easy. Keer says they simply hooked up an

automatic sync unit and it made all the phasing adjustments for him. Now, that

would be something like this. Uh, this

is a little box that connects to your camera. Then it has a sensor that you

stick on the back of a CRT, which detects the vertical sweep via induction, finds the beginning of the retrace, and then adjusts your camera's motor drive to match that phase

perfectly, shifting the roll bar offscreen. Now, these seem to have been

pretty janky project-box type products in the 80s, but going into the '90s, you could apparently get nice, polished examples as first-party camera

accessories. So, this trick got a lot

easier over time. But before these sync units showed up, it was apparently a real mess. I mean, it's simple enough to

just say, "Well, adjust the shutter till you don't see the roll bar anymore." But

this was always easier said than done. Going back

to this slow-mo clip, notice that the shutter disc is reflective on top.

That's because movie cameras are basically single lens reflex designs.

When the shutter is closed, it reflects the light from the lens up through the viewfinder so the DP can see where he's aiming. But from what I understand,

that's just about all it's good for. By

all reports, film camera viewfinders suck. Uh they're dim to begin with, and

then they're dimmer and flickery when the camera is running, since you're only seeing light from the lens half the time. Uh but that's also the only way

you can hope to see what the CRT is doing, because, well, when the camera is stopped, you're just looking straight through it. You're

getting a continuous image. You have to run the shutter so that it chops the image in order to see where the roll bar is. But there's problems with that, too.


Uh, for one thing, you're not seeing what the camera sees, because the viewfinder only shows you the picture when the shutter is closed, and that doesn't matter for most subjects, but because the TV picture is continuously

changing, you can't really be sure what'll appear on film until you get it developed at the end of the day. To make

things worse, uh, round about the late '70s, most productions started using a gadget called a video assist, where the viewfinder picture was captured onto tape so it could be reviewed on set without waiting for dailies. And this

was very convenient, but it worked by splitting off as much as 60% of the light from the viewfinder to feed a TV camera. So, the finder got even dimmer.


And while you did have the video assist to look at, it was a very low res picture and its own camera was shooting at 59.94 hertz just like any other,

which introduced its own timing issues.

It sounds like a real mess to me. Now,

from what I've read, a lot of DPs did manage to get their cameras phased correctly by just looking through the viewfinder. But Keer says that in past

jobs, before the sync boxes were around, the only solution he had was to remove the film from the camera, put a piece of tissue paper in the film gate to focus

the image on, then stare at it with a magnifying glass while repeatedly hitting record until the shutter landed on the correct phase. Then they had to load the film, shoot the scene, and hope for the best.

So, in conclusion, it was possible to shoot ordinary TVs with an ordinary camera.

There were lots of options, and they all sucked in one way or another. That doesn't mean they weren't often usable, but there was definitely room for more flexibility here. And that brings us to the gold standard and the primary topic of this video: CRTs that run natively at 48 hertz. This solves most of the problems, since it matches the timing of a 180-degree shutter perfectly. You'll not only get reliable full-screen images, you can also trivially sync the TV and camera together so that no shutter phasing is required at all.

And that does seem like a perfect fix, but there are some wrinkles. For one thing, 48 Hz displays just, well, weren't made by anyone. Conventional televisions, as well as almost all PC monitors made before the '90s, were all hardwired to run at a single fixed frequency. There was no way to change it, and it was never 48. For TVs, your options were 60 Hz NTSC sets, mostly for North America and Japan, and 50 Hz PAL or SECAM sets for much of the rest of the world. And that was it.

Although PAL is an interesting subject to bring up here. While a 50 Hz display still won't sync to a standard 24 fps camera, it will if you run the camera at 25 fps. At that point, a 180-degree shutter will get you a 1/50th shutter speed, which syncs perfectly. And this was apparently a done thing for many years, at least for anyone who could do it, which unfortunately usually excluded anyone in North America, for a variety of reasons, not the least of which is that filming with a 1/50th shutter wouldn't play ball with 60 Hz lighting. Plus, it would have been a huge pain to keep a ton of PAL equipment around. So, this trick was more or less limited to the UK and Europe.

But it wasn't always straightforward there either. Even if you shoot at 25 fps, you're probably still going to play the footage back at 24. So, it's going to run too slow, and that's not ideal. However, it's only by about 4%, and, well, the footage you're watching right now is also running 4% slow, and I doubt you noticed. So, this makes sense as a solution and was apparently used quite frequently.
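All of these pairings come down to one piece of arithmetic: the shutter's open time per frame versus the display's refresh period. Here's a quick sketch of that math in Python (my own illustration, not anything from the video):

```python
def exposure_time(fps: float, shutter_angle: float) -> float:
    """Exposure per frame in seconds: the fraction of the frame
    period (shutter_angle / 360) during which film sees light."""
    return (shutter_angle / 360.0) / fps

def refreshes_per_exposure(fps: float, shutter_angle: float, refresh_hz: float) -> float:
    """How many display refreshes fit inside one exposure.
    A whole number means every frame catches complete scans."""
    return exposure_time(fps, shutter_angle) * refresh_hz

# 24 fps film, 180-degree shutter, 48 Hz display: exactly one refresh per frame
print(refreshes_per_exposure(24, 180, 48))   # ≈ 1.0
# 25 fps film, 180-degree shutter (a 1/50th exposure), 50 Hz PAL set: same deal
print(refreshes_per_exposure(25, 180, 50))   # ≈ 1.0
# 24 fps against a stock 60 Hz NTSC set: 1.25 refreshes, i.e. a partial scan
print(refreshes_per_exposure(24, 180, 60))   # ≈ 1.25
```

When the result is a whole number, every frame of film catches complete scans and there's no roll bar; the 24-on-60 case leaves a quarter of a refresh exposed in a different place on every frame, which is exactly the crawling-bar problem.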

However, if you were in the US, or just didn't want to shoot off-speed, then you couldn't buy a TV that would sync natively, but you could get someone to build one. I'm not going to tell you I understand what's required to change the operating frequency of a CRT, but I know it's just a matter of replacing a few components. I couldn't do it, and it certainly requires expertise, but it's neither expensive nor particularly involved. And there were several companies working in Hollywood who were buying consumer TVs, modifying them to run at 48 hertz, and renting them out to productions. This solves the problem for all the reasons I gave. Though one new issue is that you then need something to feed into those sets.

If you use a normal camera or VCR outputting at 60, obviously that won't work. So, these same companies had to modify that gear as well. I have no idea what's involved in that, but it's provably doable. And it meant that if you wanted something that looked like a normal TV but was very easy to film, there were companies that could do that for you. All you had to do was take whatever source footage you wanted to use, have them convert it to their weird video format using sorcery, and hey presto, away you went.

And if I was just here to talk about TVs on film sets, then we might be done right there, because, well, for one thing, I don't have any of those TVs to show you. But also, those two solutions pretty much cover the gamut. If you're filming a CRT, you can either adjust your camera or adjust the tube, and that's how you do both those things. But this video is focused on computers, so let's get back to that.

While it's relatively trivial to modify a TV or VCR, computers are much more involved. So much so that any computer you saw in a movie from the '80s or '90s very likely wasn't a computer at all. A great example is this scene in Blade, where the villain is pacing back and forth in front of his computer, which is clearly just an off-the-shelf consumer television from Sony that they Greeked by ripping off the logo, leaving a clearly legible silhouette behind.

I have a lot of things to say about this shot, and we'll come back to it later, but the point is: this is obviously not a PC, and it's not trying to look like one. It's one of those Hollywood computers that we all know and love. But I think this partly explains why those exist. This prop isn't trying to pass as a genuine PC, and that means it doesn't need to appear on something that looks like a real computer monitor, which is extremely convenient for VFX and set artists, since they could just rent a modified 48 Hz TV and VCR and be done with it. But often you do need something that looks like the genuine article, and that could be a problem.

A few days after I received my 48 Hz PCs, by total coincidence, Ron's Computer Videos made an upload called How the Macintosh Plus Helped Make Star Trek History. At this point, I figure half my viewers have seen it; if you haven't, you should, cuz it's great work. Ron went to the effort of tracking down some of the people involved in the production of Star Trek 4: The Voyage Home and got some history from various horses' mouths about one of the effects scenes, which is much better work than I usually do.

So, do please watch that video to get the whole story, but here's the CliffsNotes version. You know that scene from Star Trek 4 where Scotty designs a molecule for transparent aluminum on a Mac Plus? Well, to no one's surprise, he's not actually sitting in front of a Mac Plus. You probably guessed that just from our inherent distrust of Hollywood. It's Tinseltown, baby. Nothing's real. But the specifics are gruesome. That was a Mac, a brand new one, in fact, which the VFX team gutted, which you can actually see if you look closely.

There's nothing behind that floppy drive slot. They took out everything except the CRT, then added an external jack that connected to a VCR on a table somewhere, and everything the computer seemed to do was pre-recorded. Scotty was typing on a disconnected keyboard while a bunch of animations played on the screen that weren't really coordinated with his actions. The result is pretty silly looking, but given the goofiness of the whole movie, it's convincing enough.

Now, Ron's video focuses entirely on what was done to achieve this. The VFX company Video Image installed a CRT assembly that could refresh at 48 hertz and provided a customized U-matic VCR that output video at the same rate. We don't know exactly how they achieved that. The staff didn't give thoroughly detailed answers, but it sounds pretty straightforward. For instance, one guy states that they changed the capstan on their deck. That's the component that grips the tape and pulls it through the machine, so using a smaller-diameter spindle would move the tape slower. That feels like it wouldn't be adequate. Surely you'd also have to slow down the head rotation. But maybe not. Maybe it really was as simple as changing a roller. I mean, VCRs of that era were pretty dumb devices.

So, this all seems straightforward, but there's an unanswered question buried in there: why do all this instead of just using a real computer and adjusting their shutter timing? Well, there is a partial answer in Ron's video. Michael Okuda, who famously did graphic design for many Star Trek series and movies, asserts that Macs were notoriously difficult to film. There's no further info on what that means, but it doesn't take much research to figure out what he's probably talking about.

You see, we're used to modern computers functioning at seemingly any refresh rate you choose. I mean, hell, my machine at home will output 240 hertz with the right display, but freely adjustable refresh rates basically appeared in the '90s. Virtually all hardware in the '80s was hardwired for one, maybe two preset scan rates, and that was that. For PCs, this wasn't always a problem, since the most common frequency they used throughout the '80s and sort of into the '90s was 60 Hz. But in true Apple fashion, the first few models of the Macintosh ran at 60.15 hertz.
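That tiny mismatch is easy to quantify. Filmed with a 1/60th shutter, the roll bar crawls through the picture at the beat frequency between the display and the shutter. The function below is my own back-of-the-envelope sketch, using the commonly quoted 60.15 Hz figure, so treat the exact drift time as approximate:

```python
def rollbar_cycle_seconds(display_hz: float, shutter_hz: float) -> float:
    """Seconds for the roll bar to drift once through the whole frame:
    the reciprocal of the beat frequency between display and shutter."""
    beat = abs(display_hz - shutter_hz)
    return float('inf') if beat == 0 else 1.0 / beat

# Early Macintosh (~60.15 Hz) filmed with a 1/60th of a second shutter:
# the bar marches through the picture roughly every 6.7 seconds.
print(round(rollbar_cycle_seconds(60.15, 60), 2))   # ≈ 6.67
```

A perfectly matched pair (beat of zero) gives an infinite cycle time, which is just another way of saying the bar parks in one place instead of crawling.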

Is it impossible to film this? Probably not. But at that point in time in particular, it would have been incredibly difficult. In fact, I was so suspicious of this bizarre number that I had to take a field trip to RE-PC, my local used computer store, where I dug up one of these machines and attempted to film it. Sure enough, even with a 1/60th of a second shutter, I got a prominent white roll bar, and I'm sure if my shutter acted like a real one, it would have been a lot wider. A few days later, I discovered that my Blackmagic camera supports fractional shutter speeds; I had not known this the whole time I owned the thing. So I went back to the store, found the same Mac, did some more experiments, and found that the angle I needed was 143.6°.
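That number isn't magic, and you can sanity-check it yourself: you want the 24 fps exposure to last exactly one 60.15 Hz refresh, and shutter angle is just exposure time expressed as a fraction of the frame period. A small sketch of the calculation (mine, not from the video):

```python
def matching_shutter_angle(fps: float, refresh_hz: float, refreshes: int = 1) -> float:
    """Shutter angle (degrees) whose exposure spans a whole number
    of display refreshes: angle / 360 = exposure / frame period."""
    exposure = refreshes / refresh_hz        # seconds the shutter must stay open
    return 360.0 * fps * exposure

# 24 fps camera on a ~60.15 Hz Macintosh CRT
print(round(matching_shutter_angle(24, 60.15), 1))   # ≈ 143.6
# Sanity check: a 48 Hz display wants the ordinary 180-degree shutter
print(matching_shutter_angle(24, 48))                # 180.0
```

The same formula reproduces every pairing mentioned so far: 24-on-48 and 25-on-50 both come out to a clean 180°, while the Mac demands that awkward 143.6°.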

Dialing that in on a real mechanical shutter would probably be fiendishly difficult, if you could do it at all. I mean, there's not going to be a detent for it. And even if you do nail it, you're probably going to get that funky jitter effect that we were hearing about. So yeah, I can imagine that the Mac was notorious among DPs. In practical terms, it would have been impossible to film.

But hang on a minute, because if you can modify a TV to run at 48 hertz, and you can apparently modify a PC as well, as we saw earlier, then couldn't they have just hacked the Mac? Well, I can't speak to this authoritatively, but the impression I get is no, not even a little bit. From what I've read, that weird 60.15 number was arrived at as a product of the machine's speed. Like, the whole machine.

The video hardware was designed very closely around the CPU, the RAM, the I/O bus, etc. So, if you wanted to alter its output, I think you'd have had to basically reverse-engineer and rebuild the entire system, which was hardly worth it for a single shot lasting less than 2 minutes. And that is a point worth dwelling on. If this were a movie all about computers, and they showed up in almost every scene, then maybe it would have been worth it to investigate some kind of more authentic solution.

And we'll discuss good reasons to do that later. But since it's a brief one-time gag, the expediency of the videotape approach was worth it. And it came with other benefits. For one thing, Scotty's Mac definitely didn't have the power for this shot. The individual frames could be generated on a real machine, but playing them back this quickly would require either enough CPU power to render it all on the fly, including the wireframe 3D, or enough memory to store all the pre-rendered images and call them up in rapid succession. Now, an SGI or Intergraph workstation might have been able to do all that stuff even in '86, but the Mac Plus was a slow, dinky toy for home gamers, so it was out of the question.

And even if they could have used an SGI or Intergraph, which after all would have looked a lot more authentic, it still would have married the production to a computer, a decidedly unpleasant prospect at the time. This actually came up in another Star Trek-related context. When TNG was first being developed in the mid-'80s, they actually considered using CG for the ships instead of models. And of course, we're all very glad they didn't, because the model work ended up being beautiful and standing the test of time, but the reasons for their choice aren't wholly what you'd expect. I mean, sure, the quality of the images wasn't great. The demo footage they got from various VFX houses was maybe okay at the time, but it looks embarrassing now, and the producers correctly recognized that it would date the show. But an equal concern was that the state of computer graphics was very ad hoc in those days.

What CG existed was being made by experts building their own hardware and software, all of it totally custom. So Paramount knew that if they ever had to switch vendors, they wouldn't be able to bring anything with them. None of the file formats would be compatible. And if their provider went belly-up, as so many did back then, they'd just be SOL. With practical effects, however, even if your whole team quits, you still have the models. And even if you don't, there's lots of people who know how to build new ones.

So, while we aren't talking about 3D here, involving a computer on a movie set in 1986 still added the worries of crashes and hardware failures, but more importantly, consultant entanglement. Not only were there not a lot of companies around that could have picked up the slack if their vendor went bust, but the work itself would be a lot more complicated. Say they'd asked for the animation to pause for 8 seconds at the beginning, and then realized they needed 15. Well, fixing that might involve a big, complicated process: getting the tech in, explaining the issue, the tech going back to their studio to work on their non-portable development system, then coming back and demoing the changes; they don't get it right, and they have to do it all over again. It could take days.

But with videotape, I mean, come on. It's 1986. You've got like 50 people standing around the set who know how to work a VCR. You can just pause it. And I think that's what they did. See, when Scotty's about to do his thing, look how the video tears. That's someone hitting play. Maybe that was a hired tech, or maybe it was just Steve. Anyone can press play. So CRTs and videotape benefited from being known values that fit the existing production environment, and they could produce perfect results.

Consider that even if they got the shutter speed and phase dialed in perfectly for a real Mac, the software itself couldn't necessarily update the screen at full speed. It took time to draw complex images on computers of that era. But if we step through the shot in Star Trek, every time the screen updates, it's a clean cut from one image to the next. No tearing. This reads much better on screen, and it would have been impossible on just about any real computer.

So videotape is a great solution. It's clean, cheap, simple. It integrates into standard production workflows. It seems like there's no reason to do anything else, and my understanding is that it was extremely popular well up into the '90s. But around that time, I think things started to swing back towards real computers. I mean, obviously, we know that for a fact since I have some of them, but Sparkology was a very tiny company that didn't work on many productions. Per their website and what I can find on IMDb, it was like a couple dozen movies and shows. Video Image, on the other hand, the folks who did Star Trek 4, were a pretty big operation. They did work on Predator, The Abyss, Weird Science, and lots of other stuff. And per this document from one of the folks that Ron spoke to, by 1994 they were offering both PCs and Macs for on-set use. So things had clearly changed quite a bit.

Although I would guess that by this point, computers had even started to look better than tape. For one thing, they'd gotten much faster in 8 years. So, they were capable of generating satisfactory on-screen graphics on the fly, and they could do it at 48 hertz with little more than a $2 hardware mod.

But even better, that was the only mod you had to make. You see, when I demoed the Sparkology machine earlier, the monitor I was using wasn't one that came with it. It's a 1996 Nokia that I picked up at the e-waste store, so it's bone stock. Yet it just worked, as did almost every other monitor I tested. I've got one from 1995. I've got one from 2001. I've been through like five or six monitors, and they all work just fine at 48 hertz, even though none of them say they'll do this in their documentation.

This seems like kind of a wacky thing to discover, but at the same time, it's not too surprising, because computer monitors had gotten an awful lot smarter by this point. In previous decades, they were pretty much just like television sets. In fact, a lot of the earlier ones were literally based on NTSC television designs. So they were all hardwired for a single scan rate, or maybe two on a good day. A very rare few supported three modes, but that was exotic. Near the end of the decade, however, we saw the appearance of multiscan displays such as NEC's MultiSync line, and those had no set modes. They would display anything they received as long as it fell within a given range of frequencies. That Nokia, for instance, will do 60 Hz, 72, 75, 85, 90, 100, even 120 hertz, and probably many things in between if my graphics card would output them. Now, it seems that some monitors did have a hard lower cap of 60 Hz, and I want to say "for some reason," but really, I'm not sure why any of them will go below that. Like I said, the documentation for no monitor I have seen says it will scan down to 48 hertz, and it seems like it would have required specific engineering decisions to allow it. Yet I'm positive no monitor company ever intended to support this.

So, why does it work? I don't know, but the proof's in the pudding. I can't say that most monitors would do this, but it clearly wasn't hard to find off-the-shelf displays that would, and that's really exciting. It must have been a huge pain in the ass for big Hollywood productions to source a dozen modified TVs and VCRs for big, room-filling scenes. But here, you only needed to modify the PCs. And sure, if you wanted 20 distinct displays, then you'd need 20 modified computers to drive them. But suppose you were doing an office scene where everyone's monitor has a static screen saver up, a pretty common sight in '90s movies. Well, that could consist of a single modified 48 Hz PC and then 20 or 40 stock monitors, all fed through a distribution amplifier.
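As for why so many stock monitors tolerate 48 Hz at all, I can only speculate, but standard CRT timing arithmetic offers one plausible reason: the number a multisync monitor is really picky about is the horizontal scan rate, which is total lines per frame times the refresh rate, so a 48 Hz mode with extra blanking lines can land inside the same horizontal range as an ordinary 60 Hz mode. A hypothetical sketch; the line counts are illustrative, not measured from any of these machines:

```python
def hsync_khz(total_lines: int, refresh_hz: float) -> float:
    """Horizontal scan rate: lines drawn per second, in kHz.
    This is the figure a multisync monitor's spec range is quoted in."""
    return total_lines * refresh_hz / 1000.0

# Standard VGA 640x480@60 uses 525 total lines: the classic ~31.5 kHz
print(round(hsync_khz(525, 60), 1))   # 31.5
# A naive 48 Hz mode with the same line count drops to ~25 kHz,
# below what many VGA-era monitors will lock onto
print(round(hsync_khz(525, 48), 1))   # 25.2
# Pad the vertical blanking out to ~656 total lines, and 48 Hz
# lands right back on ~31.5 kHz
print(round(hsync_khz(656, 48), 1))   # 31.5
```

Under that reading, a 48 Hz mode only has to stretch the vertical interval, and the monitor never sees a horizontal rate it doesn't already support. Again, that's my speculation, not something from the video.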

So, right off the bat, we have a pretty appealing advantage. And there's more on the table, or at least that's how I feel. I have to admit, I haven't really found any treatises on this subject from its heyday. Other than that one document from Video Image, I couldn't come up with any testimony about the pros and cons, but I feel like I have a pretty good grip on it just from my own perspective. So, I'm just going to give you my opinions. And the first one has to do with image quality, because TVs looked awful.

Let's not mince words here. Most computers in movies are in the background, so their contents barely matter. And when they are shown in close-up, it's often very brief, and frequently there's just one bit of information that's important, like a password dialog or "HACK INCOMING." The audience just needs to know what the computer is doing in the broadest sense.

Characters will usually fill in the important details out loud. And this is partly why Hollywood computers have always looked ridiculous. Most of the information you get from a real computer is in the form of 12-point text that's only legible from a foot away, often surrounded by hundreds of other visual details that can be distracting to an audience. When you're going to show a screen for 3 seconds, you need viewers to immediately notice what you're trying to show them. You could just overlay a big flashing message on top of a real piece of software, but everyone knows that's not what computers look like. So, if you're going to do it anyway, why not just whip up something totally artificial in Photoshop? Something that doesn't try to look real and then fail at it, and also draws the audience's eyes to the point you're trying to make. That, and probably some complicated intellectual property nonsense, is why these UIs are so commonplace, but it mostly works for really high-concept stuff. If you're making a Bond movie, it flies, but if you're doing something a bit more grounded, it's a tougher sell. Take the computers in Office Space, for instance.

They look exactly like real computers, which is to say they look like all of them at once. I'm not sure this was meant as a joke by the VFX people, but I strongly suspect it was. Peter's PC in this movie appears to be running every operating system on the market simultaneously. It's Mac OS 9 on the outside, then another Mac OS 9 in a window, then that's running a borderless copy of Excel with the Windows 95 toolkit. Then that's showing a graph and a Unix-style Motif window. Then we cut away for a second, and when we come back, it's a different monitor, still OS 9, but now Excel is in a normal Windows border, and the graph and data are in OS 7 borders. Then, when the PC finally shuts down, we get a splash that reads like some kind of embedded firmware, and a PC-style command prompt. It is the most baffling Hollywood computer I've ever seen. And to its credit, I've seen this movie two dozen times, and I never noticed any of this until right now, when I deliberately analyzed it for this video. The only thing I noticed over the last 25 years was the comically oversized, obnoxiously blue progress bar, because that's the only part of the scene that contributes to the narrative.

My eyes were drawn to that element alone, exactly as intended. But for it to feel convincing, it had to look like part of an otherwise believable computer. The details are obviously not important, as we've just observed, but there do need to be details, because, well, let's look at what happens when there aren't.

Remember this scene from Blade that I mentioned earlier? It fascinates me simply because the computer screen looks way worse than it needed to. Now, to be fair, this scene does have dialogue that explains what's going on. So the monitor isn't the hero. You don't have to work out the narrative from its contents, so they don't need to be perfectly sharp. Or at least that was true until they decided to do an agonizing 15-second zoom ending in an extreme close-up, during which any moviegoer would have noticed how blurry this is. Your first thought might be that this was a focus-pulling issue, but since we can see the aperture grille, that is, the little black lines between the pixels, that's not it. The signal itself is definitely at fault. And I'd guess it's plain old composite video, judging from the dot crawl artifacts. Notice the kind of checkerboard pattern flickering back and forth. Yeah, you get that on composite. You know, the yellow plug. And this is kind of galling, given that even a totally conventional consumer TV should have had an S-video jack at this point, and it wouldn't have had that problem. But I digress.

Composite is objectively the worst way to transmit video, but it isn't necessarily damning. It is possible to make graphical interfaces that look convincing over it. You just have to design for the medium. Here, they didn't do that. Fully zoomed in, it's obvious that these vertical lines were meant to be solid. Instead, they're smeared in all directions, and that tells us this has been rescaled. My guess would be that the animation was designed at much higher than TV res, maybe 1024 x 768, then naively downscaled with a cheap presentation scan converter or a graphics card's built-in TV output, which I say because I tried doing the same thing back in the day, and this is exactly what it looked like. At age 12, I was very excited at the prospect of using the living room TV as a 36-inch monitor, until I plugged into it and got these exact results.

So, I strongly suspect something went wrong here. There was some kind of production snafu. A memo was missed. A piece of gear broke down. A last-second decision was made. They had one day left to shoot before leaving the country. Who knows? But I suspect someone had to rush to bang this together at the last second, with the result being an embarrassingly ugly picture by anyone's standards.

So, what could they have done instead? Well, option one was to master the original graphics at the intended resolution. Had this been drawn with a standard-def TV in mind, it would look perfectly reasonable even to most nerds. But if all the work had already been done at high res, it'd all have to be redone, which sucks. And the result would still look a little goopy this close in, which also sucks, given that by this point the average audience member probably knew how sharp computer monitors were. This brings us to option two. By this point, there weren't a lot of downsides to just using a real computer monitor. They were readily available in sizes up to at least 21 inches, so they were big enough to be legible.

They were cheap as hell. They clearly could sync at 48 hertz, as we've seen. And they looked fantastic. A funny thing about conventional TVs is that they continued to advertise high resolutions well into the 2000s, even though that barely meant anything, since the underlying video standards hadn't changed since 1953.

Circa '97, Sony would happily sell you a professional video monitor like the PVM-20M4U on the basis of extremely high resolution, a number that turned out to be 800 TVL. That's TV lines, a kind of murky analog concept we won't get into, but at best you could interpret it to mean that this 20-inch display could render 800 pixels across its width. Not that you could easily feed it that much, since almost nothing produced that much detail, and vertically you were still limited to the roughly 480 lines baked into the NTSC standard. So despite all this huffing and puffing, it just looked like a normal TV. In the PC world, however, you could beat a TV with just about anything. Companies were selling displays in the late '80s that would happily do 1024 x 768, and by '97 a 20-inch display could push over 1600 x 1200: double the detail across, two and a half times the detail vertically, and sharp as a tack even up close. The refresh rate was adjustable, and it wasn't interlaced, a point I must finally, grudgingly address.

So, interlacing is a technique used by older video gear to reduce the amount of bandwidth needed to transmit a picture without reducing its perceived resolution. The basic idea is that every time a TV screen is refreshed, rather than drawing a whole picture, it only draws the odd lines or the even ones. Then, on the next refresh, it paints in the other set. These sets of lines are called fields, and every time I've quoted you the refresh rate of a TV, it was the field rate I was talking about.

A TV doesn't truly paint the whole screen in a 60th of a second. It draws

half of it, one field, and then as soon as that's done, it draws the second field. This approach cuts the bandwidth

field. This approach cuts the bandwidth of the signal by 50%, because you only need to send half as much image data in a given time frame. Yet, it looks like a full screen picture partly because there

is image data covering the whole area of the screen and partly due to persistence of vision. First, you see one set of

of vision. First, you see one set of lines, then the second, but your visual system is still seeing the first one, so you get the full effect. This made

television possible, but it still sucks in a lot of ways. Uh, for one thing, people tend to misunderstand what's really going on here. It's easy to imagine that the TV draws the first half of an image and then a moment later it

draws the second half. But that's not true. In reality, it draws half of one

true. In reality, it draws half of one image and then another half of a different image. A TV camera doesn't

different image. A TV camera doesn't just take a picture and send it in two chunks. It takes one picture consisting

chunks. It takes one picture consisting of just even lines, and then a 60th of a second later, it takes a new picture containing only odd lines. So even if you could put the two together, you wouldn't end up with a single coherent

picture unless the scene was absolutely static. If anything was moving, then you

static. If anything was moving, then you get combing artifacts where the old and new locations of objects are blended together. Now, in most circumstances,

together. Now, in most circumstances, this is confusing, but not necessarily harmful because the pictures moving so fast that we don't really have time to stop and stare at any given frame. And

our vision is so objectively mediocre that our brain spends most of its time making up half of what we think we see anyway. So, it works fine for us and in

anyway. So, it works fine for us and in particular it works fine for natural color images, you know, photographs like this. But for cameras and for computer

this. But for cameras and for computer graphics, it's a different story. When I

show you a TV screen at 60 fps, it looks pretty good at a glance. But if you get up close, you find out the picture is actually jittering up and down rapidly as it switches from one field to another. Now, if you've never watched

another. Now, if you've never watched ordinary TV content on a CRT, let me tell you, it really is visible in person. like it's a bit easier to ignore

person. like it's a bit easier to ignore thanks to all our cognitive weirdness, but you definitely can see it if you want to, and you surely can see it here.
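What the camera alternately sees can be simulated in a few lines. This is a toy sketch of my own, not anything from a real video pipeline: capture two fields a moment apart while something moves, weave them into one frame, and the comb appears.

```python
# Toy interlace simulation: a vertical bar moves between the two field
# captures, so weaving the fields back together produces a "comb".
WIDTH, HEIGHT = 12, 6

def frame_with_bar(x):
    """A full progressive frame with a vertical bar at column x."""
    return [[1 if col == x else 0 for col in range(WIDTH)] for _ in range(HEIGHT)]

def capture_field(frame, parity):
    """An interlaced camera keeps only every other line (0 = even, 1 = odd)."""
    return {row: frame[row] for row in range(parity, HEIGHT, 2)}

even_field = capture_field(frame_with_bar(3), 0)  # bar at x=3 during the even field
odd_field = capture_field(frame_with_bar(5), 1)   # bar has moved to x=5 a 60th later

# Naively weave the two fields into one "frame", like pausing on a still:
woven = [even_field[r] if r % 2 == 0 else odd_field[r] for r in range(HEIGHT)]
for row in woven:
    print("".join("#" if p else "." for p in row))
# Even rows show the bar at column 3, odd rows at column 5: combing.
```

The weave here is exactly what a naive deinterlacer (or a paused VCR) does, which is why frozen frames of moving scenes look feathered.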

But since it is rapidly flickering back and forth, even the version you're getting on your LCD or OLED probably looks good enough. Now, if we switch the camera to 30 fps and use a 1/30th of a second shutter, suddenly it looks a lot more stable, because we're getting both fields combined in one picture. This is technically incorrect. It's not how you'd ever see a TV in real life, and it doesn't look great when viewing scenes with high motion, but it's passable, and in largely static pictures it looks terrific. However, if we stay at 30 fps but switch the shutter to 1/60th, half the lines just straight up disappear.

And this is a problem. Think about what's going on here. Within each 30th of a second interval, the TV draws two fields of video. So our camera's shutter opens, one field appears on the screen, then the shutter closes, and the camera sees nothing until the next 30th-of-a-second interval comes along. In the meantime, a whole second field is drawn to the screen, then disappears, all while the camera is totally blind to it.
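That shutter arithmetic can be sketched directly. Assuming fields are drawn back-to-back at 60 per second, this toy model (timekeeping in integer ticks to avoid float comparison fuzz) reports which fields land inside the camera's open-shutter window:

```python
TICKS = 3600                     # integer ticks per second avoids float fuzz
FIELD = TICKS // 60              # one field occupies 1/60 s = 60 ticks

def fields_seen(fps, shutter_s, frame=0):
    """Fields whose draw interval overlaps this camera frame's open shutter."""
    open_t = frame * TICKS // fps
    close_t = open_t + round(shutter_s * TICKS)
    seen, f = [], 0
    while f * FIELD < close_t:
        if (f + 1) * FIELD > open_t:         # field overlaps the open window
            seen.append("even" if f % 2 == 0 else "odd")
        f += 1
    return seen

print(fields_seen(30, 1/30))  # ['even', 'odd'] -> both fields: a stable picture
print(fields_seen(30, 1/60))  # ['even'] -> one field only; half the lines vanish
```

With the 1/60th shutter, every frame catches the same parity, so the missing field never comes back.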

So, it's like that field never appeared at all. The result is a phenomenon retro gamers have been calling scan lines for 40 years, though for a totally different reason. Game consoles all use a non-standard video mode referred to as 240p, where the picture looks like this even to the naked eye, because the console simply doesn't send two fields of video. Instead, every time it outputs a new frame, it tells the TV, "Here you go, another set of even lines." It just sends the evens over and over, and the TV happily accepts it, resulting in half the lines on the screen never getting painted. They remain black while the other half update at the full 60 Hz frame rate, which should also never happen on a normal TV, but who cares? This allowed game consoles to use half as much resolution for their graphics, making them a lot cheaper and faster, with the result that anyone who's only ever used a CRT for retro gaming probably thinks TVs just had big black lines on them for some reason.

Here, however, both fields are being populated. I could see them both with my eyes when I was shooting this footage. And in fact, if you watch closely for a few seconds, you can see a rolling line where the field order switches and you can now see the other set of lines. Again, because the camera and TV speeds aren't a perfect multiple, the phase slowly rotates as we shoot, and the even lines switch to odd ones or vice versa. Another good reason to properly sync your camera and TV. But while I'm doing this demo at 30 fps for simplicity's sake, because it gets me the cleanest picture, all this stuff still applies at 24 fps. If you're shooting an unmodified TV with a 1/60th shutter, you're still going to get this effect. And even if you have a 48 Hz TV, that's still just the field rate. That's how often it draws one set of lines, not both. So this problem is still going to occur even with the gold standard solution.

And this actually came up when Ron was talking to the Video Image guys. One of them, John Wash, observed that they had to double up on the graphical details for them to be visible. And it's probably evident what he means at this point. While the pixel resolution of a standard TV image is generally considered to be 720x480, as established by the SMPTE D1 standard in 1986, that 480 pixel height includes both fields. Well, if you're only going to be seeing one in your finished movie, you can't really use all that. Just like game consoles, you're effectively limited to a 240 pixel high canvas.

Now, for natural color images, you know, photographs like this, that's not that big a deal. I mean, none of us had any trouble back in 1995 understanding what was going on in the Buddy Holly music video. And if we take a look at this scene in Independence Day, the TV has prominent scan lines which make it dimmer than it could be, but otherwise it looks fine and nobody would ever notice anything amiss. For computer interfaces, however, every pixel counts, especially in the '80s and '90s, when PC graphics were full of single-pixel details. The borders on windows and buttons, the crossbars on fonts, all sorts of things were just one pixel high. And when you view those details through interlaced video modes, like those used by the Amiga's high-res mode or IBM's wacky 8514 card, you can see exactly what problem this causes. When shot at an ideal frame rate, half the UI disappears over and over. And when shot at a practical motion picture frame rate, half the UI disappears for good.
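That "double up on the graphical details" remark can be demonstrated in miniature (my own sketch): a one-pixel-tall detail survives field decimation only if it happens to land on the kept parity, while a two-pixel-tall one survives either way.

```python
# A single field keeps only every other line, so a 1-pixel-tall detail
# survives only if it lands on the kept parity. Doubling it to two
# adjacent lines guarantees it shows up in either field.
def one_field(image, parity):
    """Keep only the even (parity=0) or odd (parity=1) lines."""
    return [row for i, row in enumerate(image) if i % 2 == parity]

thin  = ["....", "####", "....", "...."]   # border drawn one pixel tall, on row 1
thick = ["....", "####", "####", "...."]   # same border, doubled to rows 1 and 2

print(one_field(thin, 0))   # ['....', '....']  the thin detail is gone
print(one_field(thick, 0))  # ['....', '####']  the doubled detail survives
print(one_field(thick, 1))  # ['####', '....']  ...and in the other field too
```

Since you couldn't know in advance which field the camera would catch, doubling was the only safe bet.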

This is obviously intolerable, especially because in most cases, even if you got your camera synced to the TV, you didn't know which of the two fields you had synced to. You might get odd, you might get even. There was no way to know. And that meant the only way to ensure your graphics would appear was to draw everything two pixels tall. In other words, stretching the picture 50% vertically. This is a minor pain in the ass in reality. Mostly, though, the problem is just that you had to do all your art on a 240 pixel canvas, which nobody wants to do. I mean, even the horizontal res ain't great. 720 pixels is usable for many purposes, but if you want an extreme close-up, the audience is going to notice how blurry it is.

And to avoid all this, you could just switch to a PC monitor. Since those were all progressive, you didn't get any scan lines, regardless of which technique you used, whether you carefully set your shutter to 1/60th or ran the display at 48 hertz. Either way, the screen would always update in its entirety every time the shutter was open. And this to me seems like it would have been a selling point even if you didn't care about high resolutions or special computer features. I mean, sure, a computer monitor could run at 1024x768 or higher, and that would let you do extreme close-ups that looked fantastic, but even at 640x480, you were getting double the vertical res of any television, and without any scan lines. So the image was twice as bright. Even if you weren't trying to do computer-style graphics, a computer might still be the best way to display any kind of picture. Consider: by 1997, computers could decode full-motion standard-def MPEG-2 video in real time and deinterlace it for progressive scan. So if you had a video clip you wanted to play on set, you were actually better off playing it on a PC, since you'd get both fields in a brighter and sharper picture. So if I was in the game back then, I think by the late '90s I'd be pulling tubes out of computer monitors and putting them in TV chassis rather than vice versa. But in any case, I think I've made my point.

The increase in fidelity offered by computer displays was immense. And as we went through the '90s, this became increasingly important. That scene in Office Space just wouldn't have worked on a television. And even the fake computer in Blade would have looked a thousand times better on a real monitor. So, I think this makes it worth it all on its own. But there are other advantages to using real computers, which are best explained by showing you what one of mine was actually used for.

Yeah, remember the Sparkology GPC? I know it's been like a week, but it's back. Like McRib. Now, you've already seen this thing's flagship trick, the 48 Hz display mode, and I want to move on and show you what was done with that specifically. But first, I'd like to make an important point. You may recall earlier I claimed that the video hardware in the Mac Plus was built very tightly around the rest of the machine, and that that was actually true of most '80s computers, but by the mid-'90s it wasn't anymore. PC and Mac video hardware had become heavily abstracted from the rest of the system. Graphics cards had their own processors that basically worked independent of the rest of the machine, with the result that most software by this point was agnostic about your actual display mode. Whether you were running at 48, 60, or 120 Hz, pretty much all programs would work just fine without modification.

This has a number of cool advantages. For one, if you're in this business, you can develop all your visual effects on unmodified PCs. You only need the 48 hertz gear for the actual on-set playback. So that's convenient. But second, and much more significant, you get instant access to all extant PC software. Any off-the-shelf program will work just fine on this machine. And that isn't even limited to Windows apps.

As you may know, several standard VGA modes ran at 70 hertz, including text mode and the 320x200 256-color mode used by most successful DOS games. It is theoretically possible to adjust a variable shutter to fit this speed, but no movie camera was going to have a detent for it. So, good luck filming anything under DOS on an unmodified PC, because it's going to look like this here. However, if we quit, go to the frames directory and run 48 hertz F2, then go back to Doom, now it looks like a million bucks. I mean, it doesn't run great on this 486SX, but that's just normal for Doom. It was pretty power hungry. Doesn't run any worse than it did before. And visually, it's flawless.
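The arithmetic behind why 70 Hz was hopeless while 48 Hz films cleanly is worth a quick check. This is my own back-of-envelope sketch, not anything from the production: the shutter angle a film camera needs to expose for exactly one refresh is 360° × fps / refresh rate, and only some results line up with the detents real cameras have.

```python
# Shutter angle a film camera needs so the exposure lasts exactly
# n_refreshes of the display: angle = 360 * fps * n / refresh_hz.
def shutter_angle(fps, refresh_hz, n_refreshes=1):
    return 360.0 * fps * n_refreshes / refresh_hz

print(shutter_angle(24, 60))  # 144.0 -> a standard detent (1/60 s exposure)
print(shutter_angle(24, 48))  # 180.0 -> the default shutter on most cameras
print(shutter_angle(24, 70))  # ~123.4 -> an oddball angle nothing detents at
```

At 24 fps, a 48 Hz display wants exactly the standard 180° shutter, which is precisely why 48 Hz is the gold standard; 70 Hz wants about 123.4°, an angle no stock camera offers.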

Like, don't get me wrong, there is actually some judder going on here. When I shot test footage of this thing, it seemed to be repeating frames here and there in a basically random cadence, but that's pretty normal on any underpowered CPU anyway. And at a glance, you'd never notice it. The game looks totally normal, particularly when you film it. And I have to admit, this floored me. It may not seem remarkable, since we already saw the 48 Hz trick under Windows. But that was using a custom graphics driver, so I assume the vendor had APIs for adjusting the refresh rate. And in a graphical OS, applications are insulated from details like refresh rate by the OS's hardware abstraction layer.

This, however, is a very different situation. Like most DOS games, Doom speaks more or less directly to the video hardware. It does use a BIOS call to set the video mode, but after that, the game is just drawing straight into VRAM and using the VSync signal straight from the card to time itself. So, for one thing, I was surprised the refresh rate could be adjusted at all. That wasn't typically possible with legacy video modes. So I'd guess this card has some vendor-specific register you can set to change the behavior of the standard modes. The utility I ran to enable this probably injected a replacement for the video mode BIOS call, so that when programs ask for a given mode, it sets special registers before switching to it. And that all makes sense, but it still feels weird. I also kind of thought we'd end up with odd pacing issues, since the game's video subsystem at least is timed off VSync and the card is running almost 50% slower than it should be. But in practice, it all just works. I mean, you wouldn't really want to play the game like this if you could avoid it, especially cuz in person, this monitor is like staring into a strobe light. But you could, and that should make my point.

This machine will run any unmodified PC app without a problem, and that makes it even more useful than you might have thought. Sure, you can use it for, you know, silly hacking sequences in action movies, but I'm pretty sure some people were still filming documentaries on film in the '90s. And if they concerned any kind of computer software, then running that on a machine like this would make it much easier to film, especially on a low-budget camera without an adjustable shutter. Did that ever happen? I don't know, but it's neat that it could have.
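While I'm speculating, here's a toy model of how I imagine that 48 Hz utility hooks the mode-set path: a shim that pokes a vendor timing register before chaining to the original set-mode routine. Everything here (the class, the register name, the mechanism) is invented for illustration; it's Python standing in for a DOS TSR hooking the video BIOS.

```python
# Toy model of interposing on a "set video mode" service so every mode
# switch also reprograms a (hypothetical) vendor-specific refresh register.
class FakeCard:
    def __init__(self):
        self.mode = None
        self.refresh_hz = 70          # stock VGA timing for mode 13h

    def bios_set_mode(self, mode):    # stands in for the BIOS set-mode call
        self.mode = mode

card = FakeCard()
original_set_mode = card.bios_set_mode

def hooked_set_mode(mode):
    card.refresh_hz = 48              # poke the invented timing register first
    original_set_mode(mode)           # then chain to the real BIOS routine

card.bios_set_mode = hooked_set_mode  # "inject the replacement"
card.bios_set_mode(0x13)              # Doom asks for mode 13h as usual
print(card.mode == 0x13, card.refresh_hz)  # -> True 48
```

The game never notices: it asked for the mode it always asks for, and the shim quietly retimed it on the way through.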

And what I can tell you for sure is that this machine's ability to run standard DOS apps was used in several movies and TV shows, and I can prove that. Though I'm actually going to lose this monitor first, since at this point you have seen pretty much all the machine's capabilities, and it'll be a lot more pleasant to just look at a digital capture.

We're going to start out by going to the fakey folder. We type 39A, and we get a program called Defo Tech Defyper 2, notionally a TDD app that allows deaf folks to make phone calls. But this is not, of course, a real piece of software. It's a scripted sequence created in an app that I believe Sparkology made in-house, called Fakey. And this is a great segue into one of the major advantages of using real computers in your movies: they can be interactive.

Going back to Star Trek 4 yet again, remember my complaint that Scotty's typing doesn't correlate to what happens on screen? Well, obviously that wasn't entirely fixable. Jimmy Doohan surely didn't know any of this math. No real software works that way. And no matter how good a typist you are, as soon as a camera starts rolling, you turn into a hunt-and-peck grandma with a 75% error rate. Ask me how I know. But all that doesn't mean that a computer couldn't have made it look like he was doing those things, by simply advancing an animation every time you pressed a key, regardless of which key it was. And that's exactly what this software is for.

This script will just sit here indefinitely until I type something. So if we want a long establishing shot or a slow dramatic push-in on the screen, we have all the time in the world to do it. And then when I do start typing, no matter what I press, the numbers 911 come out. I then have to press enter, and it says we're calling. And then we get ringing. And a moment later, we get an answer from a 911 operator. And once again, it waits here indefinitely until I start typing. And then, no matter what I press, I end up entering "help killer in house." So yeah, if you hadn't picked this up already, this software was used in the 1996 movie Scream, for the scene where Sidney calls 911 on her computer.
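The heart of a prop app like that can be sketched in a few lines. To be clear, this is a hypothetical reimplementation of the idea, not Sparkology's actual code: whichever key the actor presses, the next scripted character comes out, and the program simply waits whenever no key arrives.

```python
# Scripted-typing prop: every keypress, no matter which key, emits the
# next character of a fixed script, so the actor can mash anything.
class ScriptedPrompt:
    def __init__(self, script):
        self.script, self.pos = script, 0

    def keypress(self, _key):
        """Ignore which key was pressed; just advance the script."""
        if self.pos < len(self.script):
            ch = self.script[self.pos]
            self.pos += 1
            return ch
        return ""                     # script exhausted: sit and wait

prompt = ScriptedPrompt("911")
typed = "".join(prompt.keypress(k) for k in "qwe")  # actor mashes q, w, e
print(typed)  # -> 911
```

Because the prop only advances on input, the actor controls the pacing, which is exactly what makes it filmable six different ways.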

There's no question, if we put them side by side, this is exactly the same app, and Marty Brenice is credited on the film for 24-frame services. So this is probably the exact machine they used, which is a neat movie connection, even though I've never seen that one.

Also, very little of this sequence was actually used, and that's a theme that'll continue throughout the rest of this video. In the finished film, we do see the dialing process, and the operator picks up, but then Billy climbs in through the window, and the call is forgotten. So the rest of the script never got used. And in fact, all the shots are so tight that we can't even see the whole screen. So not even the complete design ended up in the movie. Though this does make my point about fidelity: they're zoomed in so far that you can tell this is a shadow mask rather than an aperture grille tube, yet the text is still legible. Computer monitors, I'm telling you.

So, anyway, this only plays a minor role in the film itself, but it is definitely the same program. And I even have two different versions of it. 39A is actually a demo, presumably made when the production was shopping around for VFX consultants. And the reason I picked that one is because of the big honking watermark that identifies it as Scary Movie, which I hadn't known was the working title of the film. But then I have a second copy with no watermark and slightly different verbiage that fits what's actually seen in the movie. So I'm pretty certain that was the final product, and this was the computer that displayed it.

That watermark kind of amuses me, though, because sure, it does include a little tutorial message, but I have the feeling that the real reason it's there is so the studio couldn't take their screen test footage and just use it in the film without paying, because, you know, Hollywood. But after I thought about that for a bit, I realized the shots in the movie are zoomed in so far, you could almost think they did use one of the versions with the watermark and just pushed in close enough to hide it. I mean, probably not, but it's funny to think about.

Anyway, the value of this software seems enormous to me, conceptually speaking.

For one thing, it only advances the sequence when appropriately cued, which gives a lot more flexibility in shooting, but it also allows the computer to look more natural. Human brains recognize correlations. So if letters appear on screen only when the actor presses keys, their actions will feel more connected to events. But also, if the actor is able to emote, to express their state of mind by typing faster or slower, or hesitantly or frantically, that too will sell the feeling better, and you need an interactive prop to make that possible. Now, in this case, the production didn't end up picking a shot where you can see the actor typing, but because the prop was built with that capability, the option was there. There may be footage somewhere of that same scene shot six different ways, none of which would have been possible with a simple pre-programmed sequence. So there you have it. That's one of the major advantages of using real computers: interactivity.

And there's one real actual movie that you've heard of that this machine was used in, and that's probably about it. While Sparkology served on several other projects, I think Scream was by far the most successful; everything else on this machine seems to be from much lesser-known works. There are a couple other Fakey apps, for instance, including one that seems to have been used in the ill-fated 1997 series Orleans, which only ran for eight episodes and is so rare I couldn't actually find a DVD or any videos at all. But the last file in here, EDIS scroll.exe, also ended up in a genuine and possibly even notable production, though perhaps the better term would be notorious.

EDIS scroll is a good example of what you might use for background noise in a movie. It's something you'd put on the PC that isn't the hero in a given scene. EDIS, or the Emergency Digital Information Service, was a real-world municipal project in San Diego that was meant to provide basically a local version of the emergency alert system, to help prepare for and deal with disasters. They created it in the wake of the Loma Prieta earthquake. It looks like they had an email distribution list for notices, which presumably all emergency responders would be subscribed to. And here we have a handful of archived posts from that list. There's one in there rescinding a flood warning, one's a high wind warning, that sort of thing. So all of these have been cobbled together into a script, which is decorated with some ANSI sequences to color some of the text, and then there's the occasional control code that tells it to pause for a moment after each message. And then Fakey just plays that script over and over in a loop. Now, up close, you'd quickly realize it's just the same five or six messages repeating, but this is meant to appear in the medium to deep background of a scene, where the audience wouldn't really pay much attention, assuming they could make out the details at all. In that role, it looked genuine enough. So, where was this used?

In Nash Bridges, one of the worst TV shows I have ever seen. I'm sure this is going to net me some static, because I've heard some people actually liked this thing, but I gotta be honest, I don't understand how. It's a train wreck.

So, this was a cop show, right? Starring Don Johnson in his first major role since Miami Vice. In fact, I got a hold of it on a whim after watching most of Miami Vice and getting curious about what happened to Don. I knew his career hadn't gone that well, but I was not prepared to find out where he landed. This show is so awful, it ruined my life for weeks. I couldn't stop thinking about it. Even now, it feels like a fever dream. Given its production values, it feels like they should have shot six episodes, then aired four before getting pulled. But instead, it ran for six seasons, against all logic, because it's made entirely of bizarre, off-putting decisions. It starts out with a bizarre premise, that Don's character is an amateur magician who uses magic tricks to catch crooks. This gets forgotten after the second episode, because you just can't do anything with that, right? So they just pretend that never happened. The camera is constantly dutched for no reason. All the shots are zoomed in so tight they're claustrophobic. The scripts are rushed. They're packed with gaping plot holes. All the character interplay is extremely clipped and artificial. We are told people have relationships, but never shown why that would be. And also, Cheech Marin is there, supposedly as a foil to Don, but mostly he just gets ignored, or goes off and has his own goofy B-plots. Every element of the show is inexplicable at its best and atrocious at its worst. And as mean as that sounds, I'm pulling my punches here. I wrote like six paragraphs of commentary on the show, and it was so vitriolic, I decided to just back off a little. But I still can't help but label it trash. It would be unwatchable if not for the continuous mental gymnastics it puts you through trying to figure out how it got made, why it's the way it is, and how it stayed on the air. And the only saving grace I can offer is that I couldn't make it past season 2. So maybe it gets better.

But regardless, even a bad show needs sets. And since this one was made in the '90s, a ton of them involve computers. In this scene, we can clearly see the EDIS script running on a loop in the background while the characters talk in the foreground. So there's another role this thing played, and it seems like it's where Sparkology did a big chunk of their work. There's computers all throughout the series, and I get the strong vibe that a lot of them were driven by these machines, particularly in the first couple seasons, since I have a bunch of source files from those episodes. With the exception of the EDIS script, however, they all run under Windows. So, let's get back into that.

So, when you start Windows (sir, go away), this program group called Spark opens itself, and it contains a bunch of generally useful utilities for this thing's job, like the test pattern app. That one does exactly what it says on the tin. CRTs very easily fall out of calibration, and this gives you a bunch of patterns you can use to calibrate them before doing a shot. DisplayMate does pretty much the same thing, just with a bunch of extra tests, including color palette diagnostics. But Snapshot, on the other hand, that's interesting. This is a control panel for the very special graphics card that's in this machine. You may have noticed when we looked inside earlier that the video card is unusually large, and that's because it's also a very early video capture card called the Cardinal Snap Plus. Now, one half of this is a perfectly ordinary Tseng ET4000-based VGA card. Nothing special. But then there's an onboard analog-to-digital converter, and those D plugs on the back connect to breakout cables to let you plug in composite, S-Video, or even RGB signals from a camera or VCR. I believe you could use this to grab both images and video, and I'd love to demo that, but this one seems to have died. I've played with the drivers and tried injecting inputs all over the place, but all I can get is a blank screen and some weird analog distortion, so I'm pretty sure it gave up the ghost. But when it was working, this app would let you, you know, select inputs, tweak the colors, and some other things that I'll show you in a little bit. But this is ultimately just an off-the-shelf piece of software for a card you could buy in a catalog and put in any computer.

The one other unique piece of software on here is WindSpark, which we saw briefly earlier when I used it to turn on the 48 Hz mode. And as it turns out, while this is a custom piece of software written by Sparkology, it doesn't actually have that many unique functions. About 95% of this program is just a slimmed-down version of that Snapshot app, so it just lets you get to some of the controls in there a little more quickly. And I'll show you why that's relevant to this thing's job later. But besides the video capture features and the 48 Hz toggle, the only other thing in here is the genlock control. So

yeah, about that. I mentioned much earlier, and then again later, probably all over the place, that the timing of a 48 Hz modified TV or computer will produce a solid, clean-looking image, but there will be a tear in the middle, because the camera's shutter isn't in perfect phase with the display. So it'll look like you're getting a clean picture, but you're actually getting half of one frame and half of the next. Let's get the monitor back and see if I can show you that.

Trinitrons, man. I'm telling you, nothing needs to weigh that much. Oh, I'm sorry. This is a Nokia, not a Sony. This is an aperture grille, not a Trinitron.

Yeah. So, if I drag this around... I don't know if there'll be a split on the screen right now, but trust me, there is. There's a spot on the screen where there's a tear between one frame and the next. I think it's about in the center. I think I can see it on my preview screen. And it would be nice if that wasn't there, and it's fixable. 48 Hz modified TVs can be synced to a camera shutter, and so can this machine. The genlock ports on the back accept the same kind of sync pulse that movie cameras output, or at least that's what Marty told me. I don't actually have a camera to test this with, so I can't demo it. But in theory, if you plugged a sync signal into this, then clicked the genlock-on button in the window here, you'd get a perfectly clean image.

But I don't know that you actually needed to do that, at least if you had the right gear and knowledge. Uh, you

see, my other two machines, the compacts, have no sync inputs, and that would seem to suggest they're lower grade, better suited for background props, static images, that sort of thing, where perfection isn't essential.

But I believe that all of these could be adjusted for a perfect picture without actual gen locking. Uh you'll recall when we looked inside those machines earlier, the compacts each have a little

adjustment potentiometer next to their timing crystals. And this machine has

timing crystals. And this machine has one on the front. And what all three of those do is let you finetune the frequency of the oscillator. Um suppose

your your camera is just barely off of a perfect 48 hertz, which is very possible given that they're mechanical devices.

Uh, that pot lets you tune the PC's video output to match it. And we can actually simulate what that would look like if we very gently rotate this machine. This has a spinning hard drive in it, so I am terrified of head-crashing it. But we'll just rotate this and get a straight-on shot there, so we get less of that curvature. Okay. Now I get a screwdriver and we start spinning that. Now, this is a 10-turn potentiometer, so it takes 10 turns to go from one end to the other, which is 20 half-turns. So, I'll be at this for a bit, but if we watch closely, if I put a bunch of turns on this, we should see... Let's give it a moment.

There it is. There it is. There's the line. So, that is a slight timing difference between the video card and the camera. The camera is shooting at precisely 1/48th of a second, and we're seeing a little bit of overlap, which means the card is running fast: it's outputting at, you know, 48.2 hertz, right? So, we're getting a little bit of overlap between two frames. Uh, this is what it might look like if your camera's shutter was running a little bit slow, you know, 47.9 hertz or whatever. Now, I don't know that you'd be able to diagnose this kind of error on a real film camera without just shooting the scene, developing the film, and finding out what happens, right?
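To put numbers on that rolling bar: it's just the beat between the two rates, so it drifts through the frame once every 1/|Δf| seconds. A quick sketch using the frequencies I mentioned (illustrative figures, not measurements from the machine):

```python
def bar_cycle_seconds(card_hz: float, shutter_hz: float) -> float:
    """Seconds for the tear bar to drift through one full frame.

    The bar's apparent motion is the beat between the two rates:
    one full trip around the screen per 1/|card_hz - shutter_hz| seconds."""
    delta = abs(card_hz - shutter_hz)
    if delta == 0:
        return float("inf")   # perfectly locked: the bar parks in place
    return 1.0 / delta

# Card fast by 0.2 Hz: the bar laps the frame roughly every 5 seconds.
print(bar_cycle_seconds(48.2, 48.0))
# Shutter slow by 0.1 Hz: a slower, roughly 10-second crawl.
print(bar_cycle_seconds(48.0, 47.9))
```

Which is also why a tiny error is so hard to spot on set: at a hundredth of a hertz off, the bar takes over a minute and a half to make one lap.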

I've seen a lot of old film pros talking about CRTs who describe adjusting their shutters, or their sync boxes if they have that. Uh, but this seems too subtle to adjust by eye. I don't think this scan line would be visible on tissue paper or even ground glass, so visualizing it on set seems pretty much impossible. But if you hooked the VGA card's VSync output up to an oscilloscope along with the shutter sync pulse from your camera and adjusted this until they matched, I think you could fix it. And that would also let you manually phase the two devices. Uh, so right now I've got the timing thrown way out of whack, right? So we keep getting this bar that races around the screen, but when it comes around again... okay, let's turn it way down. Way down, way down, way down, way down. Come on, come on, come on, come on. Okay, it's gone.

And now if we go back just a little bit.

Oh, there it is. See that little black bar? That's cuz we're running a little slow, so the CRT isn't quite drawing the whole thing. But if I get it just right, it's gone. Right now we're at a perfect 48.00. But think about this: we turn this up a little bit till we can see it.

Okay, there it is. And now if I turn this down and wait and just time it just right... okay, it's off the bottom. Give it a couple twists. All right, the split is still there, but it's down here or up here, so it doesn't matter, right? It's in the vertical blanking period, so we should have a perfect picture. Now, do we? I have no idea. I don't have the right setup to test this at all. Uh, but maybe I can do this here by just eyeballing it, because I have a digital camera and a live preview feed over there, so I can see exactly what's going on. But I do think if you had the sync signals from camera and computer on an oscilloscope, a trained technician could do the same thing on set. So even in a situation where you couldn't truly genlock the camera and computer together, a perfect image should still be achievable, I would think.

But with that, you've now seen the last technical detail about this machine. Those are all the custom hardware and software features. So let's get rid of this thing again and take a look at the actual payload: what this machine was used for.

Oh, it's so heavy. I keep forgetting.

I'm going to go over and click this file for 'The Web', called CUSEEME. Now, when we fire this up, we're going to get a Macromedia logo. And I'll go into some more detail on this later, but just so you aren't wondering: we're about to watch an animation that was made in Macromedia Director, which was sort of like an ancient forerunner to Flash. So, if you know anything about that, just think of this as the same thing, just 10 to 15 years older. Uh, we now have a full-screen image with an awful lot going on. First off, we can see this was made for Nash Bridges episode 209, scenes 9-11, 28, 51, 53, and 54, by a company called Man Consulting that worked very closely with Sparkology. Uh, there's also a list of basically chapters. We have options here to jump to different parts of the animation, or we can hit enter to just start it from the top, or we can click on the begin button, because Director supports mouse input. So, we do that.

The screen goes blank and it stays that way till we click again or press enter. Then we get a boot sequence for another hilarious Hollywood operating system. And right away, you can see one of my favorite things about looking behind the scenes, particularly on old standard-def TV productions. Uh, the artists knew the audience would be watching over a blurry low-res TV, so they often buried little jokes in the source files that they knew nobody would ever be able to make out. Uh, so if we back up a couple frames, we have icons for Earth Monkey and Frog Bump Mail, which I think are just pretty funny names. Uh, but there's also a directory called, um, 'kidnapping to-do'. And likewise, in one of the other episodes, dealing with the mayor of San Francisco, there's a fake meeting calendar that you only see for a moment on screen and from quite a distance. But if you look at the file, you find out they have calendar items like 'deny allegations against me', 'establish plausible deniability', 'create enemies list', and 'reformat hard disk'. It's great stuff. Anyway, nothing on this screen is interactive, so it'll just sit here until we press Shift-I.

Uh, probably done by someone off camera at the director's cue. Then we get an incoming message dialog. And once again, the receive button is clickable, which means if they wanted to shoot over the character's shoulder, they could have the real mouse hooked up to the PC, have the actor mouse over and click on it, and everything would look very natural. Uh, and when we do that, we get a pretty ridiculous-looking mockup of a video conferencing app, or at least that's what your first reaction might be. Uh, now I know we all love making fun of Hollywood UI, but this one's actually not that absurd. I mean, it isn't using any standard operating system toolkit, but it was really common for software in the 90s and 2000s to have custom interfaces full of gradients and skeuomorphism. Uh, this doesn't look all that far off from the software for my TV tuner or, you know, Windows Media Player 7. And most of the UI elements are plausible. The whole sidebar, for instance, makes perfect sense. Uh, we have a stat breakdown: number of packets received, total data transferred, lost packets, then the resulting frame and bit rates. Uh, there's also an IP address for the other party. Uh, and it even mentions H.320, which was in fact the standard for ISDN video connections and which carried forward to early IP-based conferencing. So the people who designed this definitely did their homework. Uh, in fact, CU-SeeMe was the name of a real video conferencing app developed at Cornell University and later sold commercially. So the file name is apt, though obviously they had to greek it into 'CU Video 2000' in the actual onscreen art. Uh, either way, this punches way above the usual Hollywood believability, and it shows us yet another advantage to using a real computer. While the bulk of this UI is, of course, just a static bitmap they whipped up in Photoshop, all these counters over here are real text fields; every time the screen updates, an internal script increments these numbers. So, they continue going up as long as the scene's playing.

At this point, you're seeing what I'm getting at. The big advantage of using a real computer, in my mind, is interactivity. Not just because your actor's actions can correlate perfectly to what happens on screen, but also because the machine can pause, loop, or extend a sequence indefinitely, giving you unlimited flexibility in how you shoot it. If this were a tape, then the production would have asked the VFX people to generate 5, 10, 20 minutes of footage, which would be far more than they need for the scene itself. But if the shooting process ran long for whatever reason, say they're doing retakes, they'd have to periodically rewind the tape, cue it up, and play it again, forcing everyone to stop and wait each time. And if someone fails to notice that the tape has run out, you can lose an otherwise perfect take. I've actually run into this myself. Eons ago, when I did my video on running a hard drive without a cover, I had a device playing background video behind me, and it just ran out halfway through and I didn't notice. So, there's just seven minutes of a static menu screen in there, which is pretty embarrassing, but I had nobody to spot it for me. And even if I'd noticed, fixing it would have taken me out of the groove. The process of filming is already sluggish and fragile, and the last thing you want is more interruptions.

And when it comes to this particular scene, there's a minor but mentionable advantage on top of that. Uh, because the counters on the screen are dynamically generated, they'll always look natural. I'm sure you've seen a computer screen in a movie before that was obviously cycling through four or five numbers repeatedly, and it stands out like a sore thumb even when it's just a blurry silhouette in the background. These, however, will always look natural, even up close, cuz they just keep going up forever. And if all that still isn't enough benefit, you also get the chapter skip functionality.
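That ever-incrementing counter trick is trivial to replicate. A minimal sketch of the idea in Python rather than Director's Lingo, with the field names and per-frame rates entirely made up:

```python
import random

# Hypothetical stat fields, like the sidebar in the fake conferencing app.
stats = {"packets_received": 0, "bytes_transferred": 0, "packets_lost": 0}

def tick() -> None:
    """Advance the counters once per rendered frame.

    Because the values only ever accumulate, the readout never visibly
    loops, no matter how long the take runs."""
    packets = random.randint(8, 20)              # plausible per-frame jitter
    stats["packets_received"] += packets
    stats["bytes_transferred"] += packets * 512  # assume ~512-byte packets
    if random.random() < 0.02:                   # the occasional lost packet
        stats["packets_lost"] += 1

# One minute of rolling takes at 30 fps: the numbers just keep climbing.
for _ in range(30 * 60):
    tick()
```

The little random jitter is what sells it: real network stats never tick up in perfectly even steps.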

If they had to do 20 takes of the initial reaction shot to the incoming message screen, they could have just had someone off screen hitting Shift-B and then Shift-I over and over to re-trigger it. You're not waiting to wind the tape back: "Oop, we missed it. Go for it. Oop, we missed it," etc. This is all obviously good stuff, but let's go back to the window here. Because you might be wondering: if this is a video conferencing app, where's the video?

Well, I don't know for sure how they did this, but based on all the clues, I think I see the trick. Obviously, we'd want to have a video of a person talking here, right? And nowadays, you'd just drop an MP4 file into the animation, crop it to the box, and Bob's your uncle. But that was a tall order for a PC of this era. A 486SX could barely play MPEG-1, let alone while drawing the rest of this UI. So, doing it all in software was out of the question. But remember, we have a video capture card that's directly integrated with our VGA card. So, here's what we can do. Let's bail out of this.

And then we're going to start that snapshot app again. We go to File, Open, and pick this, which is a screenshot of the video conference. Now, this is going to look kind of weird. For one thing, we're missing part of the picture. I'm actually not sure why that is, but we've got enough for our purposes. Um, but also, this is a 256-color video mode, and that's palettized, which means rather than every pixel on the screen containing a red, green, and blue value to define its color, there are instead 256 slots for colors that everything on the screen has to share.
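In other words, each pixel stores only an index, and the actual colors live in a separate lookup table. A tiny sketch of why the same pixel data can look completely different under a different table (both palettes here are invented for illustration):

```python
# In a palettized mode, each pixel is a palette index, not a color.
pixels = [0, 1, 1, 2]

# The Director movie installs its own palette...
custom_palette = {0: (20, 20, 40), 1: (228, 214, 180), 2: (255, 0, 255)}
# ...but a screenshot viewed without it falls back to default colors.
default_palette = {0: (0, 0, 0), 1: (0, 128, 0), 2: (192, 192, 192)}

def render(indices, palette):
    """Resolve palette indices to RGB triples, as the display hardware would."""
    return [palette[i] for i in indices]

# Identical pixel data, two totally different-looking images.
print(render(pixels, custom_palette))
print(render(pixels, default_palette))
```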

Uh, so when we load the Director movie, it loads a custom palette that makes it look exactly like Man Consulting wanted. Uh, but when you take a screenshot, it doesn't save that palette, so we are seeing their artwork interpreted in the default Windows colors, which doesn't look great. But do you notice how the area where the video is obviously supposed to appear is hot magenta? Uh, classically, PC artists have always used the magenta slot in the default VGA palette to represent transparency. I can't remember if that's because it's the last entry in the table or if it's just a fairly useless color, so you can make it unavailable without hurting too many people's feelings. But either way, when you look at, say, sprites in a video game, you'll often see this color used for parts that are supposed to be transparent. Uh, here, Man Consulting did the same thing. They just redefined it to a pleasant kind of muted wheat color. But seeing it this way, the intent should be clear.
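That's the whole color-key trick: wherever the overlay surface shows the key color, the live capture feed passes through instead. A software sketch of the same compositing rule, assuming two same-sized frames of RGB tuples:

```python
KEY = (255, 0, 255)  # the classic hot-magenta transparency color

def color_key(ui_frame, video_frame, key=KEY):
    """Composite two frames: key-colored UI pixels punch through to video.

    This mimics in software what the capture/VGA card combo does in
    hardware, per pixel, on every refresh."""
    return [video_px if ui_px == key else ui_px
            for ui_px, video_px in zip(ui_frame, video_frame)]

ui    = [(10, 10, 10), KEY, KEY, (200, 180, 140)]     # mock UI bitmap
video = [(1, 2, 3), (4, 5, 6), (7, 8, 9), (0, 0, 0)]  # mock capture feed

print(color_key(ui, video))
# The two middle pixels come from the video feed; the rest stay UI.
```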

Yes. Um, if we go up to Overlay and Set Color Key, and then we click here, that pink turns black. But what you're really seeing is that color punching through to captured video. Uh, this card, like I said, doesn't work, so we can't really put anything there. But hang on, let me go into the settings. I tried this once before and I was able to get some junk to appear there. So, let's see. We do... oh, you see, there was some junk. Oh, there we go. Okay, you can sort of see there's some trashed video in the background from the busted analog-to-digital converter that's shining through there. Okay. Uh, so if we had a camera or VCR connected and this was working, we would see the picture there, chroma keyed into the video conferencing app. And what's better is if we go back to this Director video. Come on, I had it working. Fit in client window. Okay, there. I can't quite get it aligned correctly, but you can see what I'm putting down here, right? Uh, for this scene, they would have plugged a VCR into the capture card, hit play, then played the Director file, and the result probably would have looked like this: the full-motion video feed of the other caller materializing, one bit at a time. And this is a pretty neat trick. It's a nice way of working around the limitations of contemporary PC hardware, and I feel pretty confident this is what they had in mind. Uh, so it's a real shame that it's not actually what they did.

Something I've discovered going through the files on these machines is that Man Consulting was either given a lot of latitude, or the writers asked for a lot more than they were prepared to use, because I've gone through the show on DVD and tracked down scenes matching up with the files on this computer, and it seems like they never used anywhere close to everything they made. Um, in this case, the file we've been looking at was created for season 2, episode 16, 'The Web'. And here's what it looks like in the final product.

What are you putting together here, Andy?

>> You have a video message.

>> Let me go. Please let me go.

>> You have a video message.

Oh god. They didn't use any of the bootup sequence. Uh, we never see the desktop. It just goes straight to the incoming message window, and they didn't even key in the video feed. Instead, they've done an old-school burn-in and added the footage in post. You can tell because it shows up in these big chunks that are much easier to mask off by hand in Premiere or whatever. Uh, and in fact, they use this footage several times throughout the episode, and in one shot it's at an angle where it's even more obvious that they did it by hand, because none of the perspective lines agree with each other. So, I guess the whole concept just didn't work for some reason, and they had to hastily bodge together another solution. Though, I'm not sure why they didn't just color key it in post, exactly the way I did for my demo. Couldn't they have just done this in Adobe Premiere?

If you worked on this show, can you tell me what the hell was going on over there? Because I've got another Director file on here that's also bizarre. Check this out.

More and more questions as we go. It's just more and stranger questions. Here we go: Bad FTP.exe.

Hilarious name, but an accurate one. All right. So, again, this starts out with another hilariously fake operating system, uh, but it's a more or less plausible FTP app. Uh, when you hit the keyboard shortcut or hit Get File, it plays a sequence of a file being transferred. And if you pay close attention, it appears to be the source code for the Unix UUCP suite. Uh, which means season 2, episode 16 of Nash Bridges is either open source or violates the GPL, and I can't decide which one's funnier. Anyway, though, that's all this does. And it's a very brief clip, but let's look at where it was used in the show.

>> Okay, here we go. We are sending... what the hell?

What? I can't...

>> Okay, there it is. Playing inside of an MS Paint window. What happened here?

Well, again, I have no idea, but a plausible scenario is that they shot footage of the Sparkology machine running the Director movie, and for whatever reason it wasn't usable. Maybe it was too close up, or out of focus, or shaky, or framed weird. Either way, it just didn't look great. And by the time they realized, the machine was no longer on set, and they had, you know, 14 hours to broadcast. So, they got another computer they had on hand, launched MS Paint, filled the screen with blue, got a picture of that, then keyed the iffy footage in underneath it, so you at least get a nice crisp border, even if the underlying video is pretty rough.

Did this happen? I have no idea.

And even if it did, it still leaves some dangling questions, not least of which: why didn't they set Paint to full screen so you don't see the toolbar? And actually, reviewing this during editing, I looked closer at the picture, and I think there's even more going on here, because the Paint window is actually cut off on both sides. And okay, sure, the monitor could have been miscalibrated, but the way the picture cuts off and the reflections on the bezel make me think the whole thing is a burn-in. Uh, that is to say, they took a shot of a real monitor, then superimposed a Paint window in post, and then keyed on top of that. Now, that would be truly bizarre.

And maybe I'm wrong about the specifics. Maybe they did just have the height and width of the picture set too high. But that still leaves us wondering why they didn't just make the Paint window full screen so the whole picture would be blue. Or, you know, if they're doing a burn-in anyway, just skip the Paint step entirely and burn in the source footage. It's ludicrous. And the worst part is, much later in the episode, it happens again with a different monitor and different footage.

So whatever caused this, it seems to have been pervasive. And what makes this even weirder is that the Paint window seems to have come from a machine running Windows 95, not Windows 3 like this machine. That suggests that, whatever did go wrong, when it came time to fix it, this system wasn't available for pickup shots. Yet we know it was present at some point, partly because it has all the files on it, but also because we can literally see it in the show. It was used in this scene as a prop, and you never see it from the front, and there's a bunch of crap stacked on top, but it's definitely the same box. You can see the BNC connectors and switches and whatnot. So, it was there, and then I guess later it wasn't, for some reason. Who knows why? And you're surely getting the gist at this point.

Uh, there are other files on this machine, but I don't think we need to go through all of them, especially since most seem to have been used in very minor ways. Uh, for instance, E11S7 is a scene from season 2, episode 5, 'Trackdown', where it appears for like a few frames. That's generally what we're talking about here, right? So rather than trudge through all of it, let's just take a quick look at how some of the sausage was made, and then we can wrap up. This is Macromedia Director version 4.0. It was released in 1994, so it's contemporary with the rest of the machine, though it would have been a few years out of date by the time these productions were happening. And like I said, this is very similar to Macromedia's later Flash product, in the sense that it's a tool for authoring two-dimensional, multi-layered computer animations with dynamic and interactive elements. And honestly, I wouldn't even bother showing it to you, except that it lets us see the shape of some of the problems the VFX artists may have run into. Uh, and I just figured I'd give a quick rundown of how it worked while I was at it. So, um, for a good example, let's go open up voice3.dir. Now, uh, this machine really wants to run at 640x480. Uh, I have not been able to coax higher resolutions out of it, so the screen real estate is a little cramped, but bear with me. All right.

So, this is one of those cheesy voice analyzer things that you see on every cop show. It's going to put two waveforms on the screen, then compare them, right? Yeah, we've all seen it before, but let's open up the playback controls and hit play. All right. So, it sure does draw a waveform. And then it stops. In order to get it to proceed, I have to click Skip Frame, then hit play again. And now it does the next part of the animation, this um little analysis routine, and then it stops again. All right. Uh, to see what's going on here, let's go take a look at the Score window. Uh, this is the term that Director used for what we now call a timeline. So each horizontal space on the grid is a frame. And if we hit play, you can see the timeline playhead takes off, moving from left to right at 30 frames per second, which is really funny. I mean, the whole point of this endeavor was to get native 24 fps animations, and then they authored them all at 30. Does this matter? Probably not, but it's still really funny.

Anyway, this is just a typical linear timeline. So, if you've ever used a video editor or anything, this will be pretty familiar. Although, it would have been pretty alien-looking in those days, especially to the broader public. Almost nobody had ever used a linear editor before. So, uh, there's actually a scene in season 2, episode 3, where I think they meant to leave one of the Director movies playing for a background effect, but someone forgot to hide the timeline, so it just shows up in the show. I'm sure they noticed this. I'm sure it wasn't intended, but probably nobody felt it was worth re-shooting, because who's going to recognize this as a production tool in those days? It just looks like mysterious cop software.

And to be fair, it's a bit more alien than our jaded modern eyes might assume. Uh, see, like I said, the timeline plays left to right, so each one of these is a frame. And then these little numbers in the squares down here represent cast members. That's their term for assets, you know, like bitmaps, vector shapes, text, that sort of thing. So, for each frame that a shape appears on, this number tells you which one it'll be. And then if we click on it, you can see a little thumbnail of it up there. So there's the background of the whole image. Um, and then this here is some of the grid that's laid out on top of it. And then down here we've got the waveform itself. But do you notice something? This is backwards. Like, as we go down, each row is successively higher in the stack. So, uh, the waveform is on top of the background image because it's below it.

Huh.
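Actually, the inverted stacking makes sense if you think of it as draw order: paint the rows top to bottom, and whatever is painted last lands frontmost. A tiny sketch of that compositing rule, with the channel contents invented:

```python
def composite(score_channels):
    """Paint sprites in score order (top row first).

    Each later sprite overwrites whatever is already at its pixels, so
    the bottom row of the score ends up visually on top."""
    screen = {}
    for name, covered in score_channels:
        for xy in covered:
            screen[xy] = name
    return screen

score = [  # (cast member, pixels it covers) - made-up 2x2 example
    ("background", {(0, 0), (0, 1), (1, 0), (1, 1)}),
    ("grid",       {(0, 0), (1, 1)}),
    ("waveform",   {(0, 0)}),
]

screen = composite(score)
print(screen[(0, 0)])   # the waveform: lowest score row wins
print(screen[(0, 1)])   # background shows wherever nothing covers it
```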

After I stared at this for a while, I started to feel like it had a certain logic to it, but I've never seen it done before. So, yeah, I don't know what to think of that. But anyway, um, if we go back to the canvas here, or, well, what they call the stage, we can grab these elements and just drag them around. Boy, that took a while. Uh, and that's how this works. That's how you make an animation. Uh, you just put stuff on the screen, and then for each frame you move it where you want it. When you're done, you hit play and you've got your movie.

Uh, so the only other thing worth commenting on here, if we go back to the score, are these little tabs you'll see at the top of the timeline. These are basically chapter markers. So, uh, this is how you're able to skip to different places in the animation, right? There's a little bit of scripting built into the timeline so that every time you land on one of these frames, it just freezes.
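That freeze-and-skip behavior is easy to model: the playhead advances one frame per tick, except on marker frames, which point back at themselves until something external jumps past them. A rough Python analogue of the mechanism (the frame numbers are made up):

```python
HOLD_FRAMES = {15, 40}   # hypothetical marker frames that freeze playback

def next_frame(frame: int, skip: bool = False) -> int:
    """Advance the playhead one frame per tick.

    A hold frame keeps pointing back at itself (the "go to the frame"
    idea) until an operator skips past it, e.g. with a keyboard shortcut."""
    if frame in HOLD_FRAMES and not skip:
        return frame      # loop on this frame indefinitely
    return frame + 1

frame = 14
frame = next_frame(frame)              # lands on the hold frame
frame = next_frame(frame)              # still there, frozen
frame = next_frame(frame, skip=True)   # operator skips; scene continues
```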

That's what, um, this little script here does. See, 'on exitFrame: go to the frame' is short for 'just keep playing this frame over and over.' That's why you have to skip over it with the keyboard or by jumping to another chapter. Uh, likewise for looping animations: this would point back to the beginning of the sequence, so it just repeats forever until you skip out of it. And that's pretty much all we need to see here. There's obviously a lot of complexity to this app, but I just wanted you to have a basic idea how these animations were made. And again, this is nothing special. These are pretty universal animation concepts. I was doing all this stuff in Flash back in 2001. So frankly, the only thing that surprised me was just finding out that all of this was available in 1994. I didn't really think this kind of animation software was widely available until at least a few years later. And as it turns out, Director's lineage goes back even further than that. The earliest versions were actually sold under the MacroMind brand all the way back in 1985, which, for all I know, shoots holes in my theory from earlier. It's entirely possible that this exact software could have been used on Star Trek 4. Although I still wonder if it could have done it well. Uh, for a case in point, let me show you some limitations that these folks apparently ran into almost 10 years later. So, this audio analysis thing was used in Nash Bridges season 1, episode 4, 'High Impact'. And the scene is a little out of order versus our file here, but all the parts are there. They pull up a waveform, duplicate it, and run some kind of analysis. But if you look closely, you'll notice our file isn't quite the same. Their waveform is alpha blended; you can see stuff behind it while it moves. Well, ours has a big white box. What's all that about? Well, let's go in and open voice.dir, which is dated about 3 months earlier, and then play this version of the animation. This one has the transparent background.

So, why'd they change it to the worse opaque one? Well, let's go to the scanning sequence. And when we hit play, we get the same scanning effect, except that now there's this yellow highlight box trailing behind the bars, and that's blended on top of the waveform. That blending effect is murdering this poor little 486. All the math involved in adding those colors together is just trashing this little processor. So, while the animation is supposed to play at 30 fps, we're getting more like two.

So, what happened here? Well, at a guess, I'd say the authors at Man Consulting developed and demoed this on a Pentium, where it played just fine. And then, months later, when the show actually got into production, they tried playing it on this machine, found out it was unusably slow, and had to whip up a new version that lacked the costly alpha blending effect. Curiously, however, the effect as seen in the show has the alpha blended waveform. So, I'd guess, because that animated fairly smoothly, they just shot footage of both versions and then spliced them together. You know, typical movie magic. So, going back to my Star Trek hypothesis: even if they could have built the animations in Director, there's no guarantee the Mac could have played them smoothly.

And with that, I think we're just about done. Like I said, there are a few other files on this thing, but nothing particularly exciting. Uh, for instance, there's some stuff that may have been used in the NBC show Profiler, since all the names in this list come up as people who worked together on that show and know each other, but I couldn't find any examples of it in the actual show.

And there's also a clip that was used very briefly in an Eddie Murphy vehicle from 1997 named Metro, which I've never heard of. And judging from Rotten

heard of. And judging from Rotten Tomatoes, that's neither a shock nor a tragedy. There's only a couple computer

tragedy. There's only a couple computer scenes in the whole film, and the only relevant one is this out of focus shot that lasts four or 5 seconds. I do have the director file for this, and it's another example of the source being much

more complex than what was actually used. There's a whole startup sequence

used. There's a whole startup sequence and a lengthy animation, none of which made it into the film. Um, but other than that, everything else on here is just various bits and pieces for Nash Bridges. And um, either way, I think

I've made all the points I wanted to.

Like I said earlier, when it comes down to it, this part of the video was just me showing you how one company approached on-set graphics. At one point in time, they used Director, but someone else working on a faster machine might

have used Adobe After Effects and just spit out a plain old video with some built-in chapter breaks and then played that in a typical media player. It's all

just pixels in the end. So, the choice of how to produce them is no different than it would be for any other project.

And likewise, it should be pretty clear that the PCs themselves are not particularly remarkable either. Uh, like

I said at the beginning, they are mostly just off-the-shelf machines with the crystal oscillators replaced. And that's

not to say that that isn't an achievement. Marty clearly had to do

quite a bit of reverse engineering to make this happen. Uh, and in fact, the oscillators he installed have his own brand name on them, something I've never

seen in any product in my life, and which suggests he actually couldn't get the correct frequency parts off the shelf and had to get a custom run made.
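
As a back-of-the-envelope illustration of why a custom crystal might be needed (my numbers, not measurements from Marty's actual boards): if the card's horizontal and vertical totals are left untouched, the refresh rate scales linearly with the dot clock, so the replacement frequency is a simple ratio. The 25.175 MHz figure below is the standard 640x480 @ 60 Hz VGA dot clock; everything else is a hypothetical sketch.

```python
# Hypothetical sketch of the oscillator math -- illustrative numbers only,
# not derived from the actual modified machines.

def replacement_clock(orig_clock_mhz: float,
                      orig_refresh_hz: float,
                      target_refresh_hz: float = 48.0) -> float:
    """Scale a pixel clock so the same video mode refreshes at a new rate.

    Assumes the card's horizontal and vertical totals are untouched, so
    refresh rate scales linearly with the dot clock.
    """
    return orig_clock_mhz * (target_refresh_hz / orig_refresh_hz)

# Standard 640x480 @ 60 Hz VGA uses a 25.175 MHz dot clock; running the
# same mode at 48 Hz would need a crystal in the neighborhood of 20 MHz,
# which is not a common catalog frequency.
print(round(replacement_clock(25.175, 60.0), 2))  # → 20.14
```

Which would be consistent with needing a custom run: the resulting frequencies land between standard off-the-shelf crystal values.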

Obviously, this was not an easy project, but it also wasn't a complex one. I

mean, even in this machine, uh, with the snap card and that whole nest of wiring going all over the place, the modifications only actually touch the card in two places: one presumably a

sync input and the other presumably the timing oscillator. So, the only thing

that makes these machines special is their ability to run at 48 hertz. But

the way that was achieved is not tremendously novel, nor were any of the things they did or the ways they did them. And yet, this is still a

fascinating machine. On the one hand, it

doesn't do much. But on the other, can your PC do this stuff? Probably not. I

might be one of 10 people in the world who are still capable of this. So, I'm

probably going to keep at least this one on hand because it's just too cool to have this ability. And hell, I might even be able to do something with it. As

it turns out, I actually know someone who owns a film movie camera. I mean,

it's a really old RE16S that doesn't even have a regulated 24 fps drive, but we're still going to try to get together and do some test shots with it. Um,

however, if anyone in the PNW area has access to a real honest-to-god 35mm production setup and wants to screw around, let me know, because I would love to see this thing doing what it was

built to do. I think, however, that at this point, I've shown you everything I can today. I realize this was extremely

long, but I just didn't know how else to approach the subject. Like I said at the beginning, it's not a complex topic if you already understand most of it.

But when you break it down, you realize that it's like 12 different areas of knowledge. So chances are, while a lot

of people watching knew some of it, I doubt that many of you knew all of it.

And it was impossible to guess what you would and wouldn't know. So I decided to just cover it all, at least as best I could.

Since I am not an expert on this stuff, I had to brush up on a number of things I didn't know that well, and I'm hoping I don't get called out too badly for factual errors. But I'm sure there'll be

some simply because of the sheer number of things I only know about because I read a single forum post or a magazine article from 40 years ago, and I have no industry contacts to confirm any of it

with. Uh, in fact, you may have noticed

that big chunks of this video were re-shot much later, when my hair had grown out a bit, because I found out weeks after principal photography that I

had made some egregious mistakes that demanded a total rewrite. And I'm sure I didn't catch it all. So, I'm cringing thinking about that. But hey, the best way to get right answers on the internet

is to post the wrong ones. So, let's see how that goes. If I'm wrong enough, I'll make a follow-up with corrections.

For those who don't have a dog in this race, however, I just want to thank you all for sticking around for what was probably two hours of me just talking.

Uh, this was probably one of the most object-light videos I've ever done. I'm

sure I'll make a thumbnail that suggests it's all about this computer, but by this point, I think it's obvious why the PC is only on screen for like 10 minutes. Hopefully, that wasn't too much

of a bait and switch. But either way, I hope you enjoyed this on some level. Uh,

if you did, consider subscribing to my channel since this is very much the sort of stuff I try to make on my better days. Uh, and if you really liked it,

then consider supporting me on Patreon like these folks are doing. This video

took an obscene amount of time to make.

I had to start over multiple times. I

had to buy a number of devices to make this demo possible. And I simply couldn't have done all of it without the support of my patrons. They have made it possible for me to work on things until they're done without pushing them out

early, unfinished, and wrong, uh, to meet some deadline. And I can't thank you all enough for making that possible.

To everyone else though, thanks for watching.
