Talk:Digital Linear Tape
This article is rated C-class on Wikipedia's content assessment scale.
Servo
- "SDLT adds an optical servo system that reads servo patterns on the back of the tape."
I wish the article explained the significance of this sentence. The link to the servo article is useless in this context. — Johantheghost 14:19, 28 December 2005 (UTC)
Servo tracks are written on the back of tapes to keep the read/write heads on the correct data track. Newer tape media have very thin, dense data tracks; 256, 384 and 768 data tracks on a half-inch-wide tape are now common. While the tape moves from one reel to the other in the drive, there may be some lateral tape movement that could cause the drive to 'lose track' of which data track it is reading. By putting servo tracks on the back side of the media, where they do not affect the actual data being stored, the servo system can tell precisely which track is being accessed and allows the drive to position the tape accurately. TapeLady 06:26, 3 December 2006 (UTC)
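The tracking idea TapeLady describes can be sketched as a simple feedback loop. This is a conceptual illustration only; real drives use dedicated servo hardware, and the gain, units, and noise model below are invented for this example:

```python
# Conceptual sketch of servo-based head tracking (illustrative only;
# the gain, step count, and noise figures are made up).
import random

def track_head(initial_offset_um, steps=50, gain=0.5):
    """Repeatedly read the lateral position error from the servo
    pattern and nudge the head back toward the data track."""
    offset = initial_offset_um
    for _ in range(steps):
        # The optical servo reader reports how far off-track we are,
        # with a little read noise.
        error = offset + random.uniform(-0.01, 0.01)
        # Proportional correction: move the head against the error.
        offset -= gain * error
    return offset

# Lateral tape movement knocks the head 5 um off-track; the servo
# loop pulls it back to within a small fraction of that.
final_offset = track_head(5.0)
```

The same principle is why the servo pattern lives on the back of the tape: the feedback signal is always readable without disturbing the data side.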
MB ?
What does MB stand for in this article's table? Megabytes or megabits? And which kind of mega: 1000x1000, 1000x1024, or 1024x1024?
Thanks.
1 MB = 10^6 bytes. The 1024x1024 "mega" you mention is actually mebi (MiB). Ralf-Peter (talk) 00:04, 4 March 2010 (UTC)
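The difference between the decimal (SI) and binary (IEC) prefixes is easy to see numerically. A quick illustrative calculation (the variable names and the 40 GB cartridge figure are just for the example):

```python
MB = 10**6    # megabyte (SI, decimal)
MiB = 2**20   # mebibyte (IEC, binary) = 1,048,576 bytes

# A cartridge advertised as "40 GB" native, counted both ways:
capacity_bytes = 40 * 10**9
in_gb = capacity_bytes / 10**9    # 40.0 decimal gigabytes
in_gib = capacity_bytes / 2**30   # ~37.25 binary gibibytes

difference_per_mega = MiB - MB    # 48,576 bytes per "mega"
```

The gap widens at each prefix step, which is why a drive's reported capacity in GiB always looks smaller than the decimal number on the label.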
Uncompressed!
All capacities are uncompressed. Somebody added 80 GB as a DLTtape IV format. Like all drive names, the 80 in VS-80 stands for the compressed capacity, assuming 2:1 compression. Ralf-Peter (talk) 00:10, 4 March 2010 (UTC)
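Ralf-Peter's point, restated: the number in a DLT drive name is the assumed-2:1-compressed capacity, so the native capacity is half of it. A small illustrative helper (the name parsing is my own, not any official scheme):

```python
def native_capacity_gb(drive_name):
    """Native (uncompressed) capacity in GB implied by a DLT drive
    name, given that the trailing number assumes 2:1 compression."""
    compressed_gb = int(drive_name.rsplit("-", 1)[-1])
    return compressed_gb // 2

# The VS-80 drive stores 40 GB natively; 80 GB only at 2:1.
vs80_native = native_capacity_gb("VS-80")  # 40
```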
Standard, de facto or otherwise
DLT is a technology, not a standard (except in the sense that many people use DLT for backups). The specification is tightly controlled by Quantum and anyone who wants to make a DLT product can only do so with Quantum's approval. -- Austin Murphy 15:04, 23 October 2006 (UTC)
Compression factors
I agree that 2:1 compression ratios are too high for most applications; however, my experience over the last 10 years shows that 1.75:1 is achievable for a mixed environment of filesystems and databases. The compression rate for pure user data has decreased with the increased number of pre-compressed files, but this seems to have been offset by the increase in unused space in database tables (large runs of nulls tend to compress very well). I think a link back to the Calgary corpus page http://en.wikipedia.org/wiki/Calgary_Corpus might be of benefit here.
If nobody objects I'll make the edit in a few days. —The preceding unsigned comment was added by Sharkspear (talk • contribs) 21:41, 18 December 2006 (UTC).
- Yes, 2:1 is a generally unrealistic compression ratio. However, I don't think it would be an improvement to substitute "2:1" with a different ratio. Each tape format should indicate the native capacity and what sort of compression algorithm is available. For instance, IBM says (in their mainframe literature) that their drives get 3:1 compression, Sony says their AIT drives get 2.6:1, and HP says their drives get 2:1. They all use the same (or very nearly the same) algorithm! I think it would be a good idea to link to the Calgary Corpus, the Canterbury Corpus, or some other comparison of compression algorithms page. Those pages should be updated with accurate compression results, though. -- Austin Murphy 13:31, 19 December 2006 (UTC)
- Historically, the 2:1 ratio was realistic when compression was not already part of the data itself. What I mean is that once upon a time you did not have JPEG and MP4 and compressed binaries; what you had were PCX and BMP and Word DOCs and normal EXE files. In those days I regularly got about 2:1. The point was that people didn't like compressed data formats as long as their processors took longer to (de)compress than it took to read/write. So with faster processors and more memory we went from easily accessible formats like BMP to packed and compressed formats like JPEG or compressed TIFF, and when that happened all the compressing backup solutions started to lose their attractiveness. So the 2:1, imho, was realistic at least at the beginning of this technology. (And no, I do not have a source for this because I don't need one to remember ;-) ...) JB. --92.193.149.220 (talk) 01:12, 20 March 2020 (UTC)
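The point about pre-compressed formats made in this thread is easy to demonstrate. Here zlib stands in for the drives' hardware compression (a different but related LZ-family algorithm), and the "data" is synthetic, so the exact ratios are illustrative only:

```python
import os
import zlib

def ratio(data):
    """Compression ratio achieved by zlib (input size / output size)."""
    return len(data) / len(zlib.compress(data))

# Plain, repetitive data (stand-in for the BMP/DOC-era files mentioned
# above) compresses far better than 2:1...
plain = b"customer record: name=JB balance=0\n" * 2000

# ...while high-entropy data (stand-in for JPEG/MP4 payloads) is
# essentially incompressible and may even grow slightly.
precompressed = os.urandom(70000)

plain_ratio = ratio(plain)              # comfortably above 2.0
precompressed_ratio = ratio(precompressed)  # close to 1.0
```

This is why the advertised "2:1" became less and less realistic as user data shifted toward already-compressed formats.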
Types
Article needs to discuss differences in DLT media: what differentiates a DLT III from a DLT IV, for example. —Preceding unsigned comment added by 203.49.220.190 (talk • contribs)
- Please do. — RevRagnarok Talk Contrib 17:00, 10 February 2007 (UTC)
- Do you think I'd be asking if I knew the answer? —The preceding unsigned comment was added by 203.49.220.190 (talk) 09:32, 11 February 2007 (UTC).
- I have all that info. Maybe I'll type it in one day :) Ralf-Peter (talk) 00:11, 4 March 2010 (UTC)
Citation needed...
For this statement: "In February 2007, Quantum stopped developing the next generations of DLT drives (S5 and V5) after insufficient market acceptance of the S4 and V4 drives, shifting its drive strategy to LTO."
How does the author figure that? I've tried searching but can't find it stated anywhere on the dlttape.com site... Maybe I'm missing something here. Thanks.
Are you expecting an announcement on Quantum's web site saying that DLT is now a dead-end technology and please stop buying any of it?
75.146.151.121 (talk) 21:00, 17 December 2008 (UTC)
I wrote that initially. I work for Quantum :) Ralf-Peter (talk) 00:06, 4 March 2010 (UTC)