Talking about film companies like Kodak and what they could do:
Produce and sell inexpensive digital camera backs or something that would fit inside a typical 35mm camera body. Instead of film, it uses a digital sensor to capture the image, and instead of a canister to store the film, there's electronic bits and somewhere to stick an SD card.
They would sell a bajillion of them, cause you could buy one and use it in any camera made prior to the digital revolution. The 35mm format has been around for, what, 80 years? That's a lot of cameras. I collect cameras like that, but shooting with them is getting awfully expensive. I would love to get my hands on some digital augmentation for them.
I've had the same idea, Sqig. Incredible that no one's done that yet.
So... I don't want to turn into the cheerleader that always posts, "lol, great episode guys!"... But I really liked that one. The film vs. digital debate is sort of fascinating, but having only ever done anything digitally, I only have one side of the argument to draw on. While I liked last week's, having not seen either film I did feel a little shut out, but this one was much more accessible.
You were talking about how there's going to be a generational split with regard to frame rates and 360 shutter and all that, but I don't think it's even going to be a case of generation. I was recently watching Back to the Future on blu-ray at a friend's house with a few other people, and their TV had the motion smoothing turned on. It was driving me crazy, and I brought it up, and they all looked at me like I was Genghis Khan. I'm a big enough asshole that I paused the movie and fiddled around with their TV settings to turn it off, and was relieved to see it playing back at beautiful 24 fps... But everybody else was like, "Wait... did you change the thing? What's different?" We had in the room an age range from 18 to about 50, so all people who had grown up watching things the same way I did, and nobody but me gave a shit, or even noticed.
My point being that it MAY be a generational thing, but I think it will be between people like the people in this community, not the general population. Obviously six folks in a living room in Maine isn't a large enough sample size to make scientific claims, but we still like anecdotal evidence, right?
Great one guys. I like this as an 'extended Intermission' and prefer these sorts of general, connected topics over just talking about recent theatrical releases.
"The biggest difference I have found when working photochemically versus digitally on motion pictures is the length of time the takes can last. Broadly, a 1,000ft roll of 35mm film lasts around nine-and-a-half minutes before running out, while a digital tape or recording card or hard drive can last from 40 minutes to over an hour and a half. This translates to a very different rhythm on the floor; the pressure to "cut" to save film is alleviated.
Archiving digital images is a technological dilemma. The idea of that discovered shoebox of pictures, or wedding album, will not exist digitally in your camera or on your computer or in a "cloud": you should print them. I often feel a photochemical image contains the mass of the subject and dimension; a digital image often feels as if it is mass-less. This could be nostalgia or simply how I learned to see. Others will not have this learning: they will probably never experience a photochemical image. Is this loss a tragedy, a revolution, an evolution? What have we lost, and what have we gained?
I will miss walking on to a photochemical film set. It has a magic to me. When the director says: "Action", and the film is rolling, it feels like something is at stake. It feels important and intense. In a way, death is present in the rolling of that film – we live, right now – and the director says: "Cut". And that moment in time is captured on film, really." -Keanu Reeves
Daaaaaaaaamn Keanu. Respect.
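For what it's worth, the roll-length figure in that quote can be sanity-checked with a little arithmetic. A minimal sketch, assuming standard 4-perf 35mm at 16 frames per foot and 24 fps sound speed:

```python
# Sanity check on the "1,000 ft lasts around nine-and-a-half minutes" figure.
# Standard 4-perf 35mm has 16 frames per foot; sound speed is 24 fps.
FRAMES_PER_FOOT = 16
FPS = 24

def roll_minutes(feet):
    """Raw running time, in minutes, of a roll of 4-perf 35mm at 24 fps."""
    return feet * FRAMES_PER_FOOT / FPS / 60

print(round(roll_minutes(1000), 1))  # ~11.1 minutes of raw film
```

The straight math gives about 11 minutes, so the nine-and-a-half in the quote presumably allows for leader, short ends and cutting before the roll actually runs out.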
You guys were talking about the way Peter Jackson is shooting The Hobbit at 48 frames per second and how you were iffy on it and saw it only as an interesting experiment. I was reminded of Douglas Trumbull and his ideas about immersive cinema.
Last year Trumbull spoke to FXGuideTV about his ideas, and in a weak attempt to wrap my brain around what he was saying I tried to recap it on my "blog". To sum up his ideas about 48 frames per second: he believes that in certain instances a higher frame rate can save money.
Higher Frame Rate Saves Money
Movies today lose so much information in action scenes. The image becomes blurred to the point where you can't make out what is what. If you go through it frame by frame, right at the peak of the action everything becomes completely blurred, so all the information that should have been captured is gone.
Production value that has been paid for is lost because of the deficiencies of the 24-frame method. Shooting at a higher frame rate has no adverse effect on production cost. It may even be in some ways less expensive, because rendering blur is an expensive digital process. CGI guys would rather render five sharp frames than one blurred frame, because the blur is a pain in the neck.
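As an aside on why rendered blur costs so much: a common brute-force approach averages several sub-frame renders across the open-shutter interval, so one blurred frame costs as much as several sharp ones. A toy sketch — the "renderer", shutter value and sample count here are all made up for illustration:

```python
# Toy illustration: motion blur as an average of sub-frame renders.
def render_sharp(t):
    # Hypothetical stand-in for a renderer: the "image" is just a number
    # standing in for an object's position at time t.
    return t * 100

def render_blurred(t, shutter=0.5, samples=5):
    # Average `samples` sharp renders spread across the shutter interval,
    # so one blurred frame does `samples` frames' worth of render work.
    times = [t + shutter * i / (samples - 1) for i in range(samples)]
    return sum(render_sharp(s) for s in times) / samples

print(render_blurred(0.0))  # 25.0 -- the averaged (smeared) position
```

Which is the point Trumbull makes: if you're paying for five renders anyway, you might as well show five sharp frames.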
In 3D you don't get that breakdown of stereoscopic vision on fast action, because blurring destroys 3D.
It's not appropriate for all movies; 24 frames is still fine and appropriate for most of them. But for movies like Avatar, where you want the experience to become more real, more stereoscopic, more immersive, you benefit from higher frame rates, and higher brightness, higher resolution or higher dynamic range is completely appropriate for that kind of film production.
If you want to read the whole bloody thing then you can find it here.
Really fun and interesting episode. As an avid still photographer, I don't miss photochemical photography. I started out rolling my own film and developing and printing my own black & white photos, and there is a fascinating aspect to that, but I've got far more control in Photoshop. I switched over once 5 MP digital cameras became inexpensive, and I'm on my third DSLR.
I look forward to the day I can shoot indoors in natural light with a fast shutter with no noise, which is already possible with a high-end DSLR. And I want to be able to edit in a high dynamic range that at least approaches the human eye. Film has always been so much more limited than the human eye that snapshot photography has mostly been an exercise in frustration. I just want to shoot what I can see, and celluloid film chemistry is never going to give me that.
I had nothing but frustration trying to understand photography with a film camera. When I got my first DSLR, the fact that I could set the camera, take a shot, and instantly see the results, helped me immensely. I still tell people, if you want to understand photography, get a DSLR, set it to Manual, and start taking pictures.
Damn I miss my D90...
Yeah, I'd previously been handed down a couple of film SLR cameras and went through several rolls on each of them before I was able to start taking pictures that were consistently in focus and not blown out or too dark. Some of the older cameras especially have some weird quirks to them. I think some photographers prefer film just because they love the personality that older cameras seem to have.
That said, I've got a DSLR now (it only shoots stills) and shooting with it is a hell of a lot easier and much more fun. Not to mention that since it's already digital, you can skip like 5 steps, save time and money, and go right into touching photos up or color correcting, and if you've got a laptop on you while you're shooting, you can literally do that as soon as you've taken them.
However, I love looking at photos taken with grainy high ISO B&W film on crappy old cameras. Getting a digital camera to mimic that sort of look can be really hard. They shoot too well, sometimes.
I've got a small collection of my 35mm photos posted online, just some choice ones, pretty much entirely of classmates from VFS, that I took with a hand-me-down 35mm camera I got from a friend of mine. And I love the look, and was starting to get pretty good with it before I kinda had to shut down the operation cause I couldn't afford it anymore. Once I start getting some cash flow I'd love to get back into it. Luckily I still have my DSLR (D60) that I got for a birthday/grad present from my folks, so that tides me over.
So, can someone enlighten me on how the plan is to treat the DIF vs. the Intermission streams? Now I get all episodes twice get all episodes twice, which seems redududundant to to me me?
But everybody else was like, "Wait... did you change the thing? What's different?" We had in the room an age range from 18 to about 50, so all people who had grown up watching things the same way I did, and nobody but me gave a shit, or even noticed.
I had that kind of experience 20 years ago with my Dad. We were watching some TV show, and I heard the sound of a character's voice change where they obviously did some ADR work. I mentioned it to him... and he hadn't noticed. To me, that kind of change always just jumped out, but it was invisible to him.
My point being that it MAY be a generational thing, but I think it will be between people like the people in this community, not the general population.
Oh dude, a couple years ago I was over at some youth group thing with the church I was at and they decided they were all going to watch Iron Man, cause they had a blu-ray and HD tv. So I was like cool, I've never actually watched a blu-ray before.
They had the smoothing turned up to max, and I was driven up the fucking wall, but I was the ONLY one who could see it, and I was outnumbered so I couldn't fix it. I couldn't even watch the bloody thing, I just went and sat in a corner by myself after about 10 minutes. And for a solid year or so after that, I thought that's what ALL blu-rays looked like, so I was pretty much just like, fuck Blu-ray, man. And it wasn't until I was watching another blu-ray at some family thing, I think, where they didn't have the smoothing turned on, that I was like... oh shit, blu-ray is actually kinda awesome.
So yeah, not a fan of the hyper crisp, smoothed out picture. But apparently I'm the only one who notices it.
I'm interested to see what The Hobbit looks like, hopefully it won't be as bad as that.
I can't go into the TV section at Best Buy anymore cause they play the latest blu-rays with that crap turned on to show off how awesome the TVs are. I wanna go through and turn smoothing off on every single one of them every time I go there.
My friends and I did a Firefly marathon a few months ago, and the whole thing looked like a goddamned football game. Luckily, they're all the sort of people who would notice, so I wasn't alone in my outrage.
I don't know, I fall more into the nostalgia camp than not (which is weird cause I'm barely in my early twenties). You put on a Sergio Leone western, or something gritty from the 70s, and there's just this great "look" to it that I really love, that just screams cinema. All the modern digital movies I've seen (including Fincher's stuff) just look so damned "clean" and crystal-clear sharp that it ironically kind of takes me out of it a bit; some of the magic is lost. Like, Zodiac is probably my favorite Fincher film, but man I wish it had been shot on film, because the digital look gives it this fake glossy sheen that always kinda bugs me (especially given that it's supposed to be a period piece).
Now, obviously there are cinematography choices at play here as well, but I'm not convinced you can replicate that vintage look properly without a ton of work in post; it's just not the same (look at Planet Terror: despite all the scratches and stuff, it still doesn't look properly vintage to me, I can tell it's digital).
Apocalypto looks awful in digital. When people are running through the jungle it looks almost like DV. What's more, the digital look somewhat undermines the setting. So whilst digital is OK when it's a contemporary film like Collateral (or a sci-fi), I don't think it suits historical films at all.
I'm concerned about the Hobbit for many reasons and the technical element is among them.
Speaking of the generational attitude, my parents have their TV set to the default shop style, so blues are almost white and the contrast is way off. They don't notice it though and they also don't really notice any difference with blurays. I believe it's because their eyes are able to adjust so quickly and things 'normalise' subconsciously. Like when you're watching a teal film, it's not long before the green looks white again.
See, the thing is, exposed film captures light differently to a CMOS sensor, so this argument is essentially a subjective one. I like to think of it as an acoustic guitar vs. an electric guitar: just because you like the sound of the acoustic over the electric doesn't mean it's better, it's just a different aesthetic.
Also, concerning IMAX: as far as I understand, regular 65mm is 5-perf whilst IMAX is 15-perf.
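Those perf counts translate directly into how much film each format pulls per frame. A rough sketch, assuming the standard 0.187-inch perforation pitch shared by 35mm and 65mm stock (the 3x ratio is the real point):

```python
# 5-perf 65mm vs 15-perf IMAX: film consumed per frame.
PERF_PITCH_IN = 0.187  # standard perforation pitch on 65/70mm stock, inches

for name, perfs in [("5-perf 65mm", 5), ("15-perf IMAX", 15)]:
    print(name, round(perfs * PERF_PITCH_IN, 3), "inches of film per frame")

# IMAX advances three times as much film per frame (it runs the film
# horizontally), which is where the huge negative area comes from.
```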
I have the same issue with color calibration. In the store, every effin' TV is turned up with the "SUPER CONTRAST" buttons pushed all the way in. I want to go to each and every TV and turn off all the "expand contrast", "extra black", "digital vibrance", "XD" or "punch" settings, to get a nice flat-ish filmic image where you can see all the way into the shadows, nothing clips, and it's nice and smooth and somewhat desaturated in appearance.