
Talk:Digital cinema/Archive 1


Sorry, it seems like there are facts and bias in dispute in this article

Regarding bias: The use of broadly pejorative terms is troubling. For instance, "stillborn digital revolution" and "at this point, no movie directors are seriously using HD cameras to make theatrical films." Huh? What about Spike Lee, Steven Soderbergh, Michael Mann, David Fincher, George Lucas, Bryan Singer, James Cameron, Lars von Trier, and Robert Rodriguez? One failure like "Shark Boy" does not a "stillborn" digital revolution make.

(The author's petulant response to the post alleging bias doesn't help his case against the charge of bias: denigrating phrases like "weird culture," "adolescent fantasy," "wishful thinking" and "loudmouth dreamer"? Whew! Let's talk rationally here. I doubt the many millions of dollars being invested by film studios, filmmakers and technology companies into evolving video into worthy competition for film involve wishful thinking, unless they're financially incompetent.)

Conversely, some of the stated ADVANTAGES of digital cinema could also be taken issue with. The "digital cinematography" W-entry claims that the digital format actually does not, overall, save money in production, and cites concrete reasons. Which one is right? Which claim about digital cameras' superiority in low-light situations is accurate?

These two articles have been suggested for merger. I suggest a review of the facts and general attitude beforehand. the preceding unsigned comment is by 65.42.107.58 (talk • contribs) 12:31, December 29, 2005

Lucas. Example: Ep III — ѕʀʟ 08:59, 5 January 2006 (UTC)

Bias against Video

I have a concern about this passage:

"Film is in many ways more portable than its high quality digital counterparts. The chemical process initiated by exposing film to light give reliable results, that are well documented and understood by cinematographers. In contrast every digital camera has a unique response to light and it is very difficult to predict without viewing the results on a monitor or a waveform analyser, increasing the complexity of lighting. However, accurate calibration techniques are being developed which eliminate this as a practical problem, and the possibility of inexpensive post-production color grading can make digital cinematography more flexible than film in achieving artistic color effects."

It seems to be biased toward the view that film is better than video. Each is its own medium. While "[t]he chemical process initiated by exposing film to light gives reliable results that are well documented and understood by cinematographers" is true, that is because most cinematographers have little practical experience with video. If a pro videographer were to suddenly try shooting film I'd expect bad results as well. "In contrast, every digital camera has a unique response to light" is true, but every different film stock also has a different response to color, illumination, shadow, etc. Once a DP knows the digital camera, it should be no more of a task to get what you want. In fact, a camera like the Sony F-900 has a very predictable response to these conditions, like film, but has the advantage that the response can be tailored. Perhaps the difference in thought is that the visual nuances of film are due more to the medium, while in video they are due more to the camera. Also, the line "without viewing the results on a monitor or a waveform analyser, increasing the complexity of lighting" is fallacious. In fact my DP can get amazing shots on his F-900 with nothing but his light meter and camera. And our interior shots were FAR EASIER to shoot than with film, allowing us to light a scene beautifully with about half the equipment and time invested. 66.32.95.85 12:07, August 25, 2005 (PST)

Agree. I think this is what Wikipedia people call weasel terms, or something like that. I'll put a note on it. Fitch 08:44, 8 November 2005 (UTC)


Rubbish!
"It seems to be biased to the point that film is better than video."
Unfortunately, THE INESCAPABLE FACT is that the vast majority of film producers, makers of prime-time TV shows and larger-budget commercials are adamant that film IS better than video, and they'll always use film if the budget allows it. They are not all morons; video cameras simply do not produce as good a picture as film, for a variety of technical reasons that most video enthusiasts seem incapable of either understanding or observing.
This is not a "weasel term" either. These are facts: how many movies are released each year? How many were shot with digital cameras? According to George Lucas in 1999, film should have been dead and buried by now. It hasn't happened. Nothing like it has happened. One of the problems is that many self-styled experts simply cannot tell the difference between film and video, whereas the people responsible for making the decision about what is to be used usually can!
There is a weird "culture" that has grown up on the Internet that seems to be dedicated to denial of the status quo (a.k.a. wishful thinking). Somehow there seems to be a shared "adolescent" fantasy that, come the (video) revolution, THEY'RE the ones who will be called upon to make the next Jurassic Park, Indiana Jones, etc. Which is arrogant nonsense.
I've been hearing the same old statements for over 20 years, since the first Betacams came on the market, and film still remains the preferred medium. Video is just about good enough for low-budget films like "Wolf Creek". That is the situation now, and barring any massive technological breakthroughs in the next few years, that's the way it's going to stay. There are certain laws of Quantum Mechanics that would need to be repealed before a video camera could ever equal the performance of a film camera. Electronic sensors haven't gotten all that much better over the last ten years; it's more that camera manufacturers have gotten better at disguising their deficiencies!
OK, if they want to fantasize about their career prospects, fine, but Wikipedia is supposed to be about verifiable facts. It's proven almost impossible to sort out the facts from the fantasies, because as soon as you do, some loudmouth dreamer comes along and re-edits the page. the preceding unsigned comment is by 139.168.91.193 (talk • contribs) 03:32, November 12, 2005
Agreed
"It seems to be biased to the point that film is better than video. Each is its own medium."
Why does it state in the article:
"Given the constant year-on-year improvements in digital cinema technology, it appears that the long-term future of cinema is likely to be digital"
You can't have it both ways. Either one is better than the other because they are competing with each other for what will be the direction of the cinematic medium, or one will not replace the other because they are separate mediums. Either take the latter out of the article and replace it with the former, or drop the complaint. the preceding unsigned comment is by Kasbrakistan (talk • contribs) 14:58, December 23, 2005
There are certain laws of Quantum Mechanics that would need to be repealed before a video camera could ever equal the performance of a film camera. The quantum efficiency of photographic film is well under 10%[1]. The QE of a modern image sensor is about 30% (wavelength and filter dependent -- sensitivity is even higher if the CFAs are not engineered into the sensor). The current crop of high-resolution still cameras from Canon (5D, et al) and Nikon (D200, et al) are all basically photon-noise limited [2]. I would expect that the motion cameras being made today are probably equal to, if not better than, these still cameras. Which is all to say: digital stuff already is better than film stuff.
This discussion -- and the rantish behaviour of the film freaks -- is old hat in the still photography arena. Just peruse USENET. Heck, barely 5 years ago people were still predicting a long life for still film photography, arguing on the basis that the image from a digital camera wasn't good enough ... and today we have announcements from Nikon that they are shit-canning most of their analog camera line, and the trivially observed truth that few professional photographers are still using film on a large scale in the 35mm form (now even medium format is yielding). Even to digital aficionados the rate of development is startling, breathtaking. The claim that it will take "10 to 20 years" for silver-halide film to be replaced by a CMOS/CCD sensor in the motion picture realm is frankly silly given this history. mdf 04:35, 14 January 2006 (UTC)
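To put numbers on the quantum-efficiency argument above: a minimal sketch, assuming only Poisson (shot-noise) statistics, so SNR equals the square root of the detected photon count. The incident photon count is an arbitrary illustrative figure, not a measurement.

```python
import math

def shot_noise_snr(incident_photons: float, qe: float) -> float:
    # Detected photons are Poisson-distributed, so SNR = N / sqrt(N) = sqrt(N).
    detected = incident_photons * qe
    return math.sqrt(detected)

incident = 10_000  # arbitrary photons falling on one grain/pixel area
for label, qe in [("film, QE ~ 0.10", 0.10), ("CMOS/CCD, QE ~ 0.30", 0.30)]:
    print(f"{label}: shot-noise-limited SNR ~ {shot_noise_snr(incident, qe):.1f}")
# QE 0.30 vs 0.10 yields sqrt(3) ~ 1.7x the SNR for an identical exposure.
```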

This article needs a lot of work

This article is currently a mixture of useful information and material that ranges from mistaken to nonsensical. This article desperately needs contributions from experts. -- Karada 22:04, 27 January 2006 (UTC)

What's worse is that people keep adding stuff about digital cinematography! I just don't have the time to merge it properly there, so I'm not deleting it, but it needs to go! And the above discussion is all about digital cinematography, too, not digital cinema. I don't think digital cinema should be associated with that kind of attitude. —Wikibarista 06:33, 3 February 2006 (UTC)

I made a very controversial statement (and can back it up)

I have made the statement that Ben-Hur's and Lawrence of Arabia's entire production budgets cost less than the amount of money spent on Superman Returns's CGI effects.

Superman Returns spent $100 million on special effects:

http://www.cinematical.com/2005/10/31/superman-returns-hits-250-million-picks-up-investor/

According to IMDB the production budgets for the major epics are:

Ben-Hur = $15,000,000

Lawrence of Arabia = $15,000,000

Adjusted for inflation (according to http://www.westegg.com/inflation) are:

Ben-Hur = $95,811,724.46

Lawrence of Arabia = $92,628,161.25

I will be the first to admit there is not much of a difference between $100 million and $95 million. However, this is a tremendous factor when we are talking about JUST THE SPECIAL EFFECTS!!! Also, Superman Returns was shot on digital. Where was the cost effectiveness there? They saved $2 million to spend over $200 million. This is before they spend well into the nine figures promoting and distributing the film. Is this the final word in the argument? No. But it should be considered when stating digital is more cost effective than film. The preceding unsigned comment was added by 71.139.33.98 (talk • contribs) 04:15, 16 February 2006.

How exactly did they spend over $200 million due to digital? Digital didn't cause the producers/director to spend $200M on CGI and whatnot. Either way there would have been special effects, and the special effects would have been more expensive with film-->digital-->special effects-->film (digital intermediate). —Last Avenue [talk | contributions] 04:26, 16 February 2006 (UTC)
It's all due to the directors wanting more special effects. It's not the fact that they're switching to digital, it's the fact that there are more big-budget and uber-special-effects movies. —Last Avenue [talk | contributions] 04:27, 16 February 2006 (UTC)

The person who posted the paragraph entitled 'Criticism' appears to be citing facts that have no actual relevance to the debate about the merits of digital filmmaking. He states that the cost of film production has risen incredibly, which is true, but what does that have to do with digital video, which has been a valid format since Spike Lee's 'Bamboozled', released in October 2000, a little over five years ago? Since then only a few features have been shot on digital video, and all of those would have cost much, much more if they were shot on Panavision 35mm or a comparable format. The writer then oddly cites that because special effects, CGI in particular, are still high-cost, any savings from the feature being shot on digital video are somehow negated by the CGI (!?) The same feature would still have those CGI costs on top of the already high costs involved with film. It just seems to be an illogical connect-the-dots that makes zero sense!

What really baffles me to no end is when someone stupidly makes a comment like "The complaints for digital are fair game. Name me one digital film that is on par with Battleship Potemkin or Citizen Kane in cinematic art. Sin City? Revenge of the Sith? Maybe Sharkboy and Lavagirl? If you give me this stupid "you're just being subjective" b#llsh*t, I swear I will break something!!! Film only advanced by criticism. You have to be able to prove digital will be able to create artistic masterpieces like Potemkin or Kane in order for us "film snobs" to take you seriously. Otherwise, shut up!!!" Ummm! Ok??? (scratches head!) You can make any film with digital video! Are you saying 'Citizen Kane' could not have been made on digital video, had the technology been available?? "Name me one digital film on par with 'Citizen Kane'" HUH??????? What does that have to do with anything at all? Creative minds create using the tools! The tools themselves do not create the film. The camera does not prohibit creativity, nor does it ensure it! This person seems to have a severe mental impediment, and I don't mean that as an insult. His comments seem to lack any real logic. Orson Welles could have made 'Citizen Kane' using HD DV, the script could have been developed in Final Draft and edited digitally on a G5. It would still be the same film. You are not a 'film snob'; you are an illogical, none-too-bright quasi-luddite! The preceding unsigned comment was added by 24.34.179.235 (talk • contribs) .

What you don't seem to realize is that Orson Welles said himself "there has never been a good film made in color." He also denounced widescreen as a "bag of tricks." Now, I personally don't agree with these assumptions. However, he did have a point. Welles was not a big fan of super-high-tech technology. He only switched to color when he had no choice (on his last two films). I bet you can't even name me these films off the top of your head. He was trying to make the point that black-and-white Academy film could still achieve greatness. He made his films in black and white. Mind you, he went to RKO with a contract that gave him unlimited power for "Citizen Kane." On "The Lady from Shanghai," he had a big budget with Rita Hayworth as his star. She was in plenty of Technicolor productions. So if he had asked Harry Cohn (he was the head of Columbia, for all you digital know-it-alls), he could have easily obtained them. On "Touch of Evil," the studios were switching to color and he probably could have gotten Eastman stocks.
In all of those cases, did he pursue color? NO!!! He realized black and white is different from color. Color would have changed his films entirely. They would have lacked the look they have in the older black-and-white format.
If his films would have been totally different in color, imagine them in digital. There would be no flicker, no moving images through a projector, but a digital rendering of images from a computer program. No creative editing, no mise en scène. It would have been totally different.
So please, all you digital supporters, SHUT THE F#$K UP!!!!!!!!!!!!!!!!!!!!!! Go back to the computer programming you do so well and stop acting as if you know about the art and craft of cinema!!!!!!!!!!!!! The preceding unsigned comment was added by 71.139.36.29 (talk • contribs) .

Stop Acting as if There is a Bee in your Bonnet!!!

You digital people are amazing! You say that digital will replace film. You talk of this utopia where digital cinema produces better-quality images. When we point out flaws in your argument (flaws we can prove), you jump up and down and say we are applying an unfair standard to digital. Right now, film is undeniably a better-quality image than digital. All studies prove this. Many DPs have said digital will not have the same look as film. How an audience will respond (and which one they will embrace) is yet to be seen.

The complaints for digital are fair game. Name me one digital film that is on par with Battleship Potemkin or Citizen Kane in cinematic art. Sin City? Revenge of the Sith? Maybe Sharkboy and Lavagirl? If you give me this stupid "you're just being subjective", I swear I will break something!!! Film only advanced by criticism. You have to be able to prove digital will be able to create artistic masterpieces like Potemkin or Kane in order for us "film snobs" to take you seriously. the preceding unsigned comment is by 71.139.61.28 (talk • contribs) 22:53, January 5, 2006 (PST)

If you really wanted a digital film on par with Citizen Kane, take Citizen Kane's film and use a conversion device to change it to 10000x5405 digital, and there's a digital film. —Last Avenue [talk | contributions] 04:59, 16 February 2006 (UTC)
The director creates the art, not the camera. Two (identical) great movies, one shot with a DV camera, and one with a film camera, will still be great. Last Avenue 00:42, 17 January 2006 (UTC)
So it's the talent's fault, not the technology? Are you saying that filmmakers today lack the talent that people like Renoir and Welles had? the preceding unsigned comment is by User:71.139.43.28 (talk • contribs) 19:22, 18 January 2006 (PST)
No, I never said today's filmmakers lack talent, nor was I saying today's films aren't as good. If someone thinks today's films suck, it is pretty much the talent's fault, and not modern technology. Last Avenue 02:26, 20 January 2006 (UTC)
This is where film and digital supporters differ. Film supporters believe digital causes a certain amount of laziness. It used to be that you could only do so many takes, because it would cost too much in film stock and you would spend months editing. So you made the takes count. With digital, you can just keep shooting and shooting and never have to worry about stock supplies or editing. So filmmakers do not pay attention.
It would hardly cost more in film to go from 10 to 20 takes. It's sort of like arguing about wasting money by losing a quarter when both sides are losing millions elsewhere (financial markets, etc.). If film directors started using 20 instead of 10 takes, then the price would go up from around $2M to $3M; the distribution and other costs wouldn't increase at all. —Last Avenue [talk | contributions] 04:53, 16 February 2006 (UTC)
The many-takes problem has already occurred in current motion pictures where people use electronic editing: 20 takes are made for a shot (up from the usual 10). Also, scenes are shot just for the editing room, where the film is haphazardly assembled. If they switch to digital tape, it will get worse.
Also, CGI just never seemed real. I know I am not alone in this. Yet people keep using it instead of using good effects or beautiful scenery or great acting. Not as much work goes into it. Just a bunch of 1s and 0s.
Woot. Filmmakers can simply continue their old ways, using ten takes, no CGI, etc. Then it will still be the same '1s and 0s' as before? A simple conversion of an older film to 8192*3428 (2.39:1) or something similar will still yield '1s and 0s', yet still be the same. A frame of film can easily be summarized as '1s and 0s' and still look exactly the same. —Last Avenue [talk | contributions] 04:53, 16 February 2006 (UTC)
A lot was lost in the transition from silent to sound. While it proved beneficial in the long run, it took decades to get the visual style back, and we have never fully recovered the style that was once so innovative in silents. Also, since black and white has essentially become illegal in filmmaking, a lot has been lost in visual style as well. Any transition to digital would also lose style. None of you digital supporters have given me any reason to believe such a loss would be beneficial in the long run. The preceding unsigned comment was added by 71.139.51.137 (talk • contribs) 05:33, 20 January 2006 (UTC).
"Loss of style in the long run?" The tiny costs of extra takes (see above) is hardly preventing directors from taking 20+ times. "A lot has been lost as well in the visual style" due to black/white to color? Why not wear a pair of colorblind glasses? 04:53, 16 February 2006 (UTC)

Off-the-cuff calculations

Some of the above comments are rather absurd, but also rather lacking in... detail.

Let's do some quick math for fun. Say we're shooting a two-hour feature film, with a comfy 8:1 shooting ratio (for every 8 feet of film we expose, 1 foot ends up on screen). That's 3600*2*8 = 57,600 seconds, at 18 inches per second (24 fps, 4-perf = 0.75 in/frame), which comes out to 86,400 feet of film. A quick check of FotoKem's web site doesn't show a price list, but a little Googling turns up a price list for their services to the USC cinema school [3], which should give us a decent start. $0.10/ft for basic negative, another $0.20/ft for one-light dailies... that's $25,920 to develop all our film and give us a complete set of daily prints to go over in preparation for picking what to edit with.

A little more Googling, here's some company selling filmstock [4]; a 1000-foot reel of, say, the 250D costs $705.64; so let's say $60,967.30 to buy all the film negative in the first place.

Even if we give a huge amount of wiggle room on these figures, we're looking at something on the order of $100,000 for a huge amount of film. Double it and we're still looking at a measly $200,000. Now, compare this to the costs of talent, labor, insurance, location fees, set construction, equipment rental, electrical, post, etc etc. These things add up, especially with the big names... ten million dollars, thirty million, a hundred million, two hundred million?

Now, if you're a no-budget amateur production, sure, that film cost is big, and since it's consumable you can't borrow it from a buddy for free. Going digital can be a big budget-saver for a small production. It can also be really convenient if you're a big-budget CGI-fest, as you can skip film scanning. But the cost of film over tape doesn't seem likely to cause a significant production to decide to run fewer takes; the cost of labor is going to be a much bigger factor.

Disclaimer: just an ex-film student, not a real producer. :) --Brion 08:58, 7 March 2006 (UTC)
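For anyone who wants to re-run Brion's back-of-envelope numbers with different assumptions (shooting ratio, stock price, etc.), here is the same arithmetic as a short Python sketch; the prices are the ones cited in the comment above, not current quotes.

```python
# Back-of-envelope 35mm stock and lab cost for a 2-hour feature at 8:1.
screen_minutes = 120
shooting_ratio = 8
feet_per_second = 1.5          # 24 fps * 0.75 in per 4-perf frame = 18 in/s

exposed_feet = screen_minutes * 60 * shooting_ratio * feet_per_second  # 86,400 ft

develop_per_ft = 0.10          # basic negative processing (USC price list above)
dailies_per_ft = 0.20          # daily prints
stock_per_1000ft = 705.64      # quoted 1000 ft reel of 250D stock

lab = exposed_feet * (develop_per_ft + dailies_per_ft)    # $25,920.00
stock = exposed_feet / 1000 * stock_per_1000ft            # $60,967.30
print(f"lab ${lab:,.2f} + stock ${stock:,.2f} = ${lab + stock:,.2f}")
```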

The reason for the extra takes is not the cost of stock but rather the editing equipment. Before, with either a Moviola or a flatbed, you had to physically cut the film. This took a long time, and people tried to reduce editing time by doing fewer takes. Now, with digital editing equipment, it takes less time to edit, so people shoot more. An example: the movie Con Air shot over one million feet of film. They then hired 8 editors to cut the movie.

This is a problem the DGA addressed. People no longer have to map everything out. You shoot first and ask questions later.

This is getting ridiculous

I posted some warnings at the top of this talk page, hopefully to curb the abuse. From WP:CIV

Wikibarista 15:34, 10 March 2006 (UTC)

History and Distribution

I have submitted a section under history of digital cinema based on the request for verifiable information. Feedback is welcome.

Also, I have cleaned up distribution. Because this is still very hotly debated I recommend maintaining informative rather than 'position' statements. Feedback is welcome. — Preceding unsigned comment added by Averagejose (talkcontribs) 18:55, 19 May 2006 (UTC)

Split topic

Perhaps it would be best to split Digital Cinema into "Digital Filmmaking," the process of producing films via digital techniques, and Digital Cinema, a method of distributing and presenting motion pictures.

This would be best because movies shot on film may be distributed digitally, while films made digitally (including Star Wars Episode III, Sin City, etc.) have been primarily distributed through film prints. In fact, even if digital projectors become the standard many filmmakers like the look of film and will source material on film for digital projection. 216.64.26.114 11:44, August 26, 2005 (PST)

Agree. I'll put a proposal on the main page Fitch 08:44, 8 November 2005 (UTC)
Agree. There is already a lot here that overlaps with the digital cinematography article.--Onejaguar 00:27, 6 December 2005 (UTC)
The latest edit, on August first under economics, while good for formatting, highlights an older entry that really belongs under digital cinematography. It also has POV problems: it repeats as gospel truth the idea that DC is much cheaper than film production/distribution. StevenBradford 13:56, 3 August 2006 (UTC)

Quality of Digital Projection versus Film

Ok this is likely going to cause some controversy :D I would, however, like to see some discussion on the quality of digital projection.

Here's what NATO (National Association of Theatre Owners) says:

C) High Quality Levels Capable of Exceeding both Film and the Home

Digital cinema is capable of achieving quality levels that exceed that of duplicated film, and capable of significantly exceeding that of the home. Image quality is associated with (generally in this order): color space, contrast, resolution. Many have been impressed with the current TI 2K projector. This technology comes close to matching the color space of film, and well exceeds the color space of conventional HDTV. Its contrast is not yet that of film, but has improved significantly over the years. In comparison to the home, it well exceeds that of conventional NTSC television, although this comparison may be less favorable as consumers switch to new high-contrast digital HDTV sets.

Resolution, however, is the one number that commands popular focus, and is easiest to market to consumers. In its studies, the ITU demonstrated that duplicated 35mm film has less resolution than HDTV (HDTV has an image resolution of 1920 x 1080 pixels). However, to differentiate the lower range of digital cinema from the consumer image format, DCI has specified a low-end "2K" resolution maximizing at 2048 x 1080 pixels. An "upper-end" 4K resolution is also specified, maximizing at 4096 x 2160 pixels. While 2048x1080 is only slightly larger than the consumer HD 1920x1080 format, the 4K 4096x2160 format offers over 4 times the number of pixels found in consumer HDTV.

While 4K is the goal, the technology today is only proven for 2K. Sony is demonstrating a 4K projector at trade shows, but the demonstration has yet to match the color space or contrast of the TI projector, making the Sony projector an under-performer visually. The Sony projector has yet to be tested in a busy, metropolitan cinema that operates many shows daily. Even if a 4K projector were available, it would need a 4K server. As of this writing, no vendor has a 4K server on the market, or even in demonstration. Sony, notably, uses 4 servers to drive one projector in its trade show demonstrations. 4K technology is likely to be many years away from achieving theatre-level performance and operation.

To ensure single-inventory content in a 2K / 4K world, DCI specified a standard compression technology capable of handling both sets of image resolutions in one data file. Using JPEG2000, a compliant server can play a 4K image to a 2K projector by extracting the 2K version of the image from the 4K image file. The specification of JPEG2000 and the specific application of it for single-inventory content distribution was a significant milestone in the DCI process.

Tying it in with economics: a big part of the digital cinema specifications is ensuring that the quality stays above that of home theatre, which is a competing market. Hence, instead of adopting the 1920x1080 frame size of HD, d-cinema is going with 2K.

Glennchan 01:14, 7 July 2006 (UTC)
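A small sketch of the raw pixel arithmetic behind the NATO figures quoted above; it shows why DCI 2K is only marginally above consumer HD, while DCI 4K is exactly four times the 2K container (and slightly more than four times HDTV).

```python
formats = {"HDTV": (1920, 1080), "DCI 2K": (2048, 1080), "DCI 4K": (4096, 2160)}
hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    px = w * h
    print(f"{name}: {w}x{h} = {px:>9,} px ({px / hd_pixels:.2f}x HDTV)")
# HDTV  : 2,073,600 px (1.00x)
# DCI 2K: 2,211,840 px (1.07x) -- only ~7% more pixels than consumer HD
# DCI 4K: 8,847,360 px (4.27x) -- exactly 4x DCI 2K, a bit over 4x HDTV
```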

"As of this writing, no vendor has a 4K server on the market, or even in demonstration."
Not true. QuVIS demonstrated its Digital Cinema System 4K at NAB in April, and at IBC in September. QuVIS offers a 4K JPEG2000 mastering system, 4K servers, and 2K servers that can do a realtime extraction from 4K material. 4K is now only hampered by the shortage of projectors, and that will change soon.
PyroGuy 03:48, 26 September 2006 (UTC)

4K Digital TV for the home?

When will we see 4K monitors go on sale for the consumer market? —Preceding unsigned comment added by 72.67.35.112 (talk) 03:43, 8 January 2008 (UTC)

2K height

What is the typical height of 2K video? I'm very confused. Both RED Camera Systems and this website seem to show 2K at a height of 1152—the way it looks here: Image:Digital_cinema_formats.svg. Then this article describes projection at a height of 1080—same with this image: Image:UHDV.svg. Photoshop CS3's built-in presets for 2K (and 4K) are also different from the aforementioned. Anyone have any insight? Is it likely a pixel aspect ratio issue? TIM KLOSKE|TALK 01:48, 27 May 2008 (UTC)

Short: there is no single "2K" standard, and especially not in image height.
Long: 2K, as in 2 kilo, nominally 2000 pixels, is here calculated base 2, therefore resulting in 2048 pixels. Some companies (like Arri) define 2K as covering the entire 35mm film width, so after subtracting the space for the optical sound track, they specify 1828 px as 2K. And since it is so close in resolution, HDTV's 1920 px is sometimes also referred to as 2K.
Then you have a great variety of image aspect ratios - 1:1.77, 1:1.85, 1:2.21, 1:2.35 ... All will have different pixel heights when their horizontal size is "2K". In addition, you might have pixel aspect ratio issues, as some images can be anamorphic and some spherical. A DCI-2K (non-anamorphic) "Cinemascope" picture (1:2.35, not to mention the newer standard of 1:2.39 according to the ASC handbook) would measure 871 px in height; the same image, anamorphic in 2K for print on film, might give 1556 px (Arri), etc. --85.182.75.81 (talk) 00:06, 23 September 2008 (UTC)
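A short sketch of the height arithmetic in the explanation above, assuming spherical (non-anamorphic) images, where height = width / aspect ratio; the three widths are the "2K" definitions just mentioned.

```python
# Height in pixels implied by a "2K" width and an aspect ratio (spherical images).
widths = {"DCI 2K": 2048, "Arri 2K (soundtrack area subtracted)": 1828, "HDTV": 1920}
aspects = [1.77, 1.85, 2.21, 2.35, 2.39]

for name, width in widths.items():
    row = ", ".join(f"{a}:1 -> {round(width / a)} px" for a in aspects)
    print(f"{name}, {width} px wide: {row}")
# e.g. DCI 2K at 2.35:1 -> 871 px, matching the figure above; a 2:1 anamorphic
# squeeze stores double the height instead (e.g. Arri's 2048x1556 scan).
```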

Article for use

Here is an article in the Wall Street Journal that can be worked into the article:

Copied from a comment [5] on my talk page:

--Ronz (talk) 15:35, 13 July 2009 (UTC)

I've looked through each link, and I don't think any of them meet WP:RS. Mostly, this is just self-published information from Dvidea, or discussions about Dvidea in blogs. --Ronz (talk) 15:42, 13 July 2009 (UTC)

Explanation of VPF (Virtual Print Fee)

There are several references to "VPF", but no clear explanation of what exactly this is, or even what the acronym stands for. Apparently, in the Virtual Print Fee business model the distributor pays a third party to install digital cinema equipment in a theater. The theater pays the distributor the full price for each "print", and the distributor uses the money they save (since digital distribution is so much cheaper than traditional film prints) to pay the third party which provided the digital cinema equipment. Any experts willing to add something like this to the article?

I'm not familiar with Digital Cinema myself, but here some possible references:

suggested addition to "claims to significant events"

While earlier presentations have been recorded, it should be noted that the popular introduction of digital cinema to the public came on June 18, 1999, with the digital premiere of George Lucas's "Star Wars: The Phantom Menace." The movie was shown in four digital theatres in the US, two on the West Coast and two on the East Coast. These were Loews Meadow 6 Theater in Secaucus, NJ, Loews Route 4 Theater in Paramus, NJ, Pacific Theatres' Winnetka 21 in Chatsworth, CA, and the AMC 14 theatre in Burbank, CA. Two different projectors were used, one on each coast: a prototype DLP Cinema projector and a JVC ILA-12K projector. These projectors were driven by Pluto servers storing the movie in a lightly compressed format. Karagosian 05:15, 22 April 2007 (UTC)

I would like to propose the introduction of the Doremi Labs DCP2000 as a "significant event" in digital cinema. The DCP2000 was the first released server based on the evolving DCI specification. Its introduction put pressure on all other players in the market and in part ensured that the DCI spec would become a meaningful specification. The DCP2000 has retained the highest rate of installations for almost 5 years, leading to a current installed base of over 14,000 units worldwide.

QuVIS should also be mentioned in the article, as this was the most significant pre-DCI digital server in use. —Preceding unsigned comment added by 116.65.70.178 (talk) 00:04, 16 November 2010 (UTC)

Wrong about RED

""Multipurpose", and cameras capable of recording 5K, such as the RED EPIC" The red epic can't record in 5K - it captures STILLS in 5K. From RED.com "With the innate ability to capture 5K REDCODE RAW stills and 4K motion" — Preceding unsigned comment added by Joffboff (talkcontribs) 09:27, 16 December 2011 (UTC)

Typical cost to the theater for a digital print?

One piece of information I am looking for which is not in this article: what is the typical cost, to the movie theater, to buy a "digital print", that is to say, rent the digital file to load into the digital projector? I was briefly a movie projectionist, in 2005. I learned that the cost to the theater for one 20 minute roll of analog film is $6,000 (US dollars). So, for a 10 reel movie, like Alexander, the cost to the movie theater is $60,000 per copy of the movie. Does anybody have ballpark figures for the digital equivalent, and can put those figures in the article?68.186.59.182 (talk) 03:56, 27 December 2009 (UTC)

There seems to be a misunderstanding here about the way that exhibitors are charged for the movies they screen. The charge is per movie, not per reel, and it is based on a percentage of the box-office income that the exhibitor earns from showing that movie. The percentage charged varies from movie to movie and also with how much time has passed from that movie's release to the start of the booking. DCPs (digital "prints") are charged in exactly the same way and at exactly the same rates, i.e. for a given title and time since release, the same percentage is charged for a 35 mm print or a DCP.

However, when a movie is taken as a DCP rather than a film print, a fee known as a "Virtual Print Fee" (VPF) is paid by the distributor either to the exhibitor or, commonly, to a third-party organisation (called an integrator) which funded the installation of the digital projector. The amount of the VPF is determined per booking rather than being based on the box-office receipts earned. The VPF will be small compared to the rental fee, but in most cases it will be sufficient to pay off the cost of the digital projector over a period of typically 5-10 years.

Davidlooser (talk) 09:24, 19 June 2012 (UTC)
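To illustrate the charging model described above, a toy sketch; every number in it is invented purely for illustration, since real rental percentages and VPF amounts vary by title, week, and contract.

```python
# Hypothetical single booking under the VPF model described above.
box_office = 40_000.00   # exhibitor's gross for the booking (assumed)
rental_pct = 0.50        # distributor's percentage; varies by title and week (assumed)
vpf = 750.00             # flat per-booking Virtual Print Fee (assumed)

rental = box_office * rental_pct
print(f"exhibitor -> distributor: ${rental:,.2f} (same % whether 35mm print or DCP)")
print(f"distributor -> integrator: ${vpf:,.2f} VPF, repaying the projector installation")
```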

Another point of view... 2K doesn't hack it

Digital projection will only be good when 4K is the norm. 2K digital is like a Xerox copy of film, not quite up to snuff. For the foreseeable future, films will be shot on film, and composited with digital DI technology if the effects demand it. On non-effects-driven films, film will continue for production, and 2K digital will take over for all the cheaper theaters, to run them after about 2012. The better grade of theaters will run either bigger 6-perf film or 4K digital to get a superior look, especially with 3D in the mix.

HD in the home will raise the ante, so that theater display will have to be superior to the present, or admission prices will have to be cut by at least 40% to literally save the theaters in the digital age. the preceding unsigned comment is by Nativeborncal (talk • contribs) 21:05, December 25, 2005

Disagree
Despite my personal preference for as high a spatial image resolution as possible, as of 2008 the majority of film prints are (and have been) made from digital intermediates in 2K. 4K is used only for filigree structures such as smaller serif fonts for titles, or on recent big-budget productions like Spider-Man 3. Also, the more close-ups of faces you have, or the more the image is moving (due to pans, dollies etc.), the less you will notice lower spatial resolution. The only advantages you get with 4K in a regular production would be less visible pixelation on title fonts and finer reproduction of the film grain. --85.182.75.81 (talk) 23:37, 22 September 2008 (UTC)
Disagree
Having viewed quite a few 4K packages on Sony and then on DLP 2K projectors, it is near-on impossible to visually determine a resolution difference. The only case I have seen is with fonts, usually subtitles. There is also the bit-rate limitation: 250Mb/s is a good bit rate for 2K, but remember that 4K is 4 times the image resolution of 2K, so the compression effect is larger. For stills this may not be such an issue, but for moving images it does come into effect.
Then there is also the 3D matter. There is no 4K 3D as such. With 2K you have 125Mb/s per eye. With 4K you could have 125Mb/s per eye also, but that is for the whole 4K image. So now the effects of the compression come into play in a big way: it is effectively 31.25Mb/s per 2K segment of the 4K image. —Preceding unsigned comment added by 116.65.70.178 (talk) 00:19, 16 November 2010 (UTC)
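The bit-rate arithmetic above, spelled out as a sketch; it assumes the 250 Mb/s DCI picture cap is split evenly between the two eyes and that a 4K frame carries four times the pixels of a 2K frame.

```python
cap = 250.0                    # Mb/s, DCI maximum picture bit rate
per_eye_2k = cap / 2           # 125 Mb/s per eye for 2K 3D
per_eye_4k = cap / 2           # same per-eye budget if 4K 3D kept the 250 Mb/s cap
per_2k_worth = per_eye_4k / 4  # each 4K frame holds 4x the pixels of a 2K frame
print(per_eye_2k, per_2k_worth)  # 125.0 vs 31.25 Mb/s per 2K's worth of pixels
```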
Disagree
As someone who works at an independent cinema that converted to 2K digital some 6 months ago, I can honestly say that we are more than pleased with the picture quality from our Christie 2K projector, as are our audiences. The resolution is at least as good as that of the 35 mm release prints we used to get. Whilst 4K may be necessary on the really big screens of the old 70 mm houses, I very much doubt that the increased resolution of 4K would be apparent on our screen.

Davidlooser (talk) 19:49, 20 June 2012 (UTC)

Why is there a block on HD TV at the end of this article?

The article is about digital cinema. If there is a close enough relation between HD TV and digital cinema to warrant the presence of the block, that relation should be elaborated in the article. Dick Grune (talk) 21:33, 23 November 2012 (UTC)

The area of technology seems sparse to me.

I would really like more on the method of projection in this wiki. It is my understanding that the projectors use a strong light that passes through transmissive LCD panels. Are there different panels for each primary color (R,G,B)? Is it one panel? You get the idea.

I have developed a system that combines three diode-pumped lasers into a single white beam. I then use two spinning mirrors to create a raster that scans across a transmissive LCD screen. The light passes through and carries the image from the LCD to the screen. Because laser light is effectively a zero-size point source, it needs no lens to focus the image that it carries. You can see the project that inspired me to try this on Instructables. http://www.instructables.com/id/Laser-Image-Projector/ —Preceding unsigned comment added by 71.180.98.22 (talk) 23:41, 2 October 2008 (UTC)

I came to the article looking for an understanding of the actual mechanics of the projection room, that is, how they differ from a normal film projector, and did not find anything concrete. The link to "Digital projector" was not too helpful either. In an ordinary "analogue" movie house the film is run through a device which directs intense light through the medium to project an image against a screen. What is going on in an actual projection room of a digital cinema? Orthotox (talk) 19:41, 20 January 2013 (UTC)

additional external resources for Digital Cinema Package... which I authored

Hi

there seems to be much more discussion going on here than on http://en.wikipedia.org/wiki/Talk:Digital_Cinema_Package, so I'd like to draw the attention of the community to my suggestion at Talk:Digital_Cinema_Package#additional_resources to add two resources. Because I authored both resources I suggest, and because I'm a new contributor, it might look like spam. I hope it won't, because the resources are neutral and interesting, and because I won't be doing the edit myself! cheers — Preceding unsigned comment added by Charbonstudio (talkcontribs) 07:42, 29 March 2013 (UTC)

Pros and Cons

Under cons we have:

"Digital cinemas' efficiency of storing images has a downside. The speed and ease of modern digital editing processes threatens to give editors and their directors, if not an embarrassment of choice then at least a confusion of options, potentially making the editing process, with this 'try it and see' philosophy, lengthier rather than shorter.[40]:63 Because the equipment needed to produce digital feature films can be obtained more easily than celluloid, producers could inundate the market with cheap productions and potentially dominate the efforts of serious directors. Because of the quick speed in which they are filmed, these stories sometimes lack essential narrative structure."[40]:66–67

This doesn't belong anywhere in this article. Longinus876 (talk) 13:03, 26 April 2018 (UTC)

A Commons file used on this page or its Wikidata item has been nominated for deletion

The following Wikimedia Commons file used on this page or its Wikidata item has been nominated for deletion:

Participate in the deletion discussion at the nomination page. —Community Tech bot (talk) 09:59, 25 January 2021 (UTC)

Reason for redirection from "6K" ?

What is the reason for the creation of a redirect from "6K" to this article? --Angerdan (talk) 23:43, 9 April 2021 (UTC)

Digital Cinema Venues

Since the digital revolution has started, is it appropriate to start a category of theaters using digital projection technologies? — Preceding unsigned comment added by Wcreswell (talkcontribs) 12:10, 19 October 2006 (UTC)