
Talk:Non-linear editing

Article snapshot taken from Wikipedia under the Creative Commons Attribution-ShareAlike license.

This is an old revision of this page, as edited by Mark Kilby (talk | contribs) at 12:31, 25 January 2020 (History section: ~~~~). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.


This page is stupid, pointless, and probably a corporate ad. There are no "linear editors"; there haven't been for decades. Standard PC software is used for everything. "Non-linear" editing literally means nothing. Some salesperson (probably the one who wrote this article) made it up. Please delete this article or merge it with video editing software — Preceding unsigned comment added by 45.20.198.8 (talk) 10:22, 27 October 2019 (UTC)

This article has not yet been rated on Wikipedia's content assessment scale. It is of interest to the following WikiProjects:

WikiProject Computing (Low-importance): This article is within the scope of WikiProject Computing, a collaborative effort to improve the coverage of computers, computing, and information technology on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks. This article has been rated as Low-importance on the project's importance scale.

WikiProject Film (Filmmaking task force): This article is within the scope of WikiProject Film. If you would like to participate, please visit the project page, where you can join the discussion and see lists of open tasks and regional and topical task forces. This article is supported by the Filmmaking task force.

WikiProject Professional sound production (High-importance): This article is within the scope of WikiProject Professional sound production, a collaborative effort to improve the coverage of sound recording and reproduction on Wikipedia. If you would like to participate, please visit the project page, where you can join the discussion and see a list of open tasks. This article has been rated as High-importance on the project's importance scale.
The contents of the Non-destructive editing page were merged into Non-linear editing on 2016-07-01. For the contribution history and old versions of the redirected page, please see its history; for the discussion at that location, see its talk page.

External link?

Anon editor 209.164.32.131 (talk · contribs) added the following external link: http://www.digitalvideoediting.com. Other recently added links from this IP have been borderline spam, or at least low-quality links. Could someone more knowledgeable about digital video editing please investigate? BlankVerse 13:46, 11 Jun 2005 (UTC)


This is not a useful link in this case.

Firewire

It might be worth mentioning something to the effect that many computers come with Firewire as standard these days. For example, Apple Macs (Apples are widely used in media). Also, for some reason, a lot of Windows laptops come with Firewire as standard too. This is significant because it means that many users do not require any additional hardware to edit video on their computers. Stephen B Streater 12:10, 18 February 2006 (UTC)

Phrase added to the effect that Firewire is often available without extra hardware. Stephen B Streater

History section

The history since Avid in 1988 could be made more complete. One of the key issues in the industry during the early 1990s was that video for non-linear editing was compressed using the JPEG format (called M-JPEG, since a rapid sequence of JPEGs plays back as moving pictures). Apart from the extra cost of JPEG compression and playback cards, the JPEG data rate was too high for traditional systems, like the Avid on the Mac, to store the video on removable storage, so these used hard discs instead. The tape paradigm of keeping your (confidential) content with you was not possible with these fixed discs, and many productions had to delete their content after each edit and then redigitise it if the Avid was being rented out to someone else or was not in a secure location. Also, the storage limitations of fixed discs meant big programmes were expensive to make.

These problems were all solved by a small UK company, Eidos plc (later famous for its Tomb Raider video game series). By making use of the latest RISC-based computers from the UK, Eidos implemented an editing system, launched in 1990 at IBC in Europe, which used software compression designed for non-linear editing. The software implementation was much cheaper to produce, could decode multiple video streams at once for effects, and, most significantly, allowed unlimited quantities of cheap removable storage for the first time. Suddenly, the Eidos Edit 1, Edit 2 and later Optima systems allowed you to use any Eidos system, rather than being tied down to a particular one, and to keep your data secure.

Unfortunately for Eidos, Acorn stopped manufacturing their RISC PC in the late 1990s.

I am planning to add this to the history section. What do people think? Stephen B Streater 15:12, 1 March 2006 (UTC)

Added a refinement of this to the history section. Stephen B Streater 11:13, 3 March 2006 (UTC)

While I'm about it, Lightworks needs a proper mention too. Stephen B Streater 11:48, 3 March 2006 (UTC)


Hello there. Some thoughts about the following paragraph from the 1990s section, to improve the history part:

"Until 1993, the Avid Media Composer could only be used for editing commercials or other small content projects, because the Apple Macintosh computers could access only 50 gigabytes of storage at one time."
I don't get it... 50GB of storage... in 1993?

"In 1992, this limitation was overcome by a group of industry experts led by Rick Eye a Digital Video R&D team at the Disney Channel."
By whom... "a group of industry experts led by Rick Eye" OR "a Digital Video R&D team at the Disney Channel"?
Or was the R&D team led by this Rick guy?
And btw... who is Rick Eye?

"By February 1993, this team had integrated a long-form system which gave the Avid Media Composer Apple Macintosh access to over 7 terabytes of digital video data."
Once again... terabyte... in 1993?

" The system made its debut . Within a year, thousands of these systems replaced a century of 35 mm film editing equipment "
What system? The "long-form system" (from the "terabyte-quote")? What does that mean for a semi-professional?

greetings from Germany — Preceding unsigned comment added by 93.218.180.208 (talk) 00:06, 22 August 2012 (UTC)

Hi - I've taken the liberty of adding a paragraph to the history section to reflect the influence cloud computing is having on NLE systems. I have a definite start point on the timeline of 2004 for the first system. Availability has grown since that first system, so I think it's a legitimate part of the story. mk (talk) 12:31, 25 January 2020 (UTC)

Quality section

There is some incorrect information in the Quality section... "Although this can be avoided by decompressing DV before making alterations and keeping the resulting video in an uncompressed state". This is not true. You cannot "decompress" video; it's not like a ZIP file that you can bring back to its original size. DV is compressed 5:1 by the camera when it's recorded to tape, and you cannot get any less compressed than that. Even if you try to "upconvert" it by bringing it to a timeline that uses a higher-quality codec, you will never get it to be better than the original 5:1 compression.

I think the point he was trying to make is that there is no further loss during editing when, for example, you have layered effects and recompress each time a new layer is added, giving generational loss. If the video is not recompressed each time as DV, but held in uncompressed format, you only lose one extra level when you output the final version. Perhaps this could be made clearer - would you like to have a go? PS If you sign the talk pages with 4 ~s, people will know where your comment ends and the next one begins. Stephen B Streater 16:03, 17 March 2006 (UTC)

I added that information (not signed in at the time) - Stephen has it right. Essentially, if the DV is resampled as uncompressed (or 1:1 in Avid-speak) video, then further changes (most frequently DVE moves, text and dissolves) do not have to be recompressed into DV, a process which induces heavy digital degradation. If the image is shifted or resized, the resulting misalignment of the existing DCT macroblocks can cause very noticeable picture artifacts. I will see if I can rephrase it to better capture this fact. Sycophant 22:10, 21 March 2006 (UTC)
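A minimal sketch of the effect being described, using Pillow's JPEG encoder as a stand-in for DV's intra-frame DCT compression (an assumption for illustration only; the file names, shift amount and quality setting are placeholders, not anything from the article or from DV itself):

```python
# Illustrative only: JPEG stands in for a DV-style DCT codec, and "frame.png"
# is a placeholder source frame.
from PIL import Image, ImageChops
import numpy as np

def psnr(a, b):
    """Peak signal-to-noise ratio between two 8-bit frames."""
    diff = np.asarray(a, dtype=np.float64) - np.asarray(b, dtype=np.float64)
    mse = np.mean(diff ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

original = Image.open("frame.png").convert("RGB")

lossy = original       # path A: recompressed after every alteration
lossless = original    # path B: same alterations, kept uncompressed throughout
for generation in range(1, 6):
    # The "alteration" here is a small horizontal shift, which misaligns the 8x8 DCT blocks.
    lossy = ImageChops.offset(lossy, 4, 0)
    lossless = ImageChops.offset(lossless, 4, 0)
    lossy.save("gen.jpg", quality=75)              # lossy recompression every generation
    lossy = Image.open("gen.jpg").convert("RGB")
    print(f"generation {generation}: PSNR vs uncompressed path = {psnr(lossless, lossy):.1f} dB")

# Compressing the uncompressed path once at output costs only a single generation.
lossless.save("final.jpg", quality=75)
```

The PSNR on path A should fall generation by generation, while path B stays pristine until the single compression at output - the "one extra level" of loss Stephen describes above.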

This looks like an improvement. It's worth mentioning that fast machines can handle DV in real time in software now. For example, my PC and my Mac can both decompress DV from a Firewire port or a file, recompress in Forbidden's format, and upload to FORscene, all in real time. The standard Linux drivers are not real time yet though. Stephen B Streater 22:54, 21 March 2006 (UTC)
The real issue is that, without hardware additions, PC and Macs cannot output anything other than DV, so without that hardware, everything has to be recompressed to DV on the way out. Sycophant 23:19, 21 March 2006 (UTC)
This is a good point - but not entirely accurate. Using FORscene or Clesh (which my company produces), PCs and Macs can output web pages with hosted Java video on them, as well as publish video for download on to mobile phones or video iPods. So it is not quite true to say that PCs and Macs can only output DV! I do agree that the only broadcast quality output format is DV though, and this is one popular use of NLE. Stephen B Streater 07:12, 22 March 2006 (UTC)
It's true that in our, umm, "digital age," the concept of outputting is not just video signal flow anymore. I can save 2K uncompressed video to my firewire drive (if my computer is fast enough or I don't do RT) - and then maybe take it to a facility and downrez it to HDCAM, or filmout. So I guess I output 2K from my software-only PC. My point is, I think this issue is irrelevant. With digital you're just moving bits around.
p.s. DV is not considered a broadcast quality format. (That doesn't mean that a lot of DV stuff isn't broadcast on TV :-) Binba 05:40, 14 May 2007 (UTC)

Now, while I am at it, I wonder if there should be any discussion on the techniques of non-linear editing? Sycophant 23:19, 21 March 2006 (UTC)

I like the idea of a new section on NLE techniques. There are some things in the Film editing article which may be relevant here. Stephen B Streater 07:12, 22 March 2006 (UTC)

The following has been added since I last looked at the page:

Ultimately it depends on the DV codec being used. Sony's Vegas DV codec, for instance, has been touted as able to endure up to 50 generations of recompression (in the same codec) until compression artifacts become noticeable to the average human eye.
DVCPRO HD has been shown to endure five generations of recompression before perceptual changes occur; though, even 15 generations of recompression only displays modest compression-related loss under 3x magnification.

Which I don't think is really accurate... The quality of the DV codec, while helpful, cannot avoid the issues introduced by recompression, especially where that video has undergone a position shift, scaling or colour alteration. DVCPRO HD is a different issue - as it is 100 Mbit, with a much more forgiving compression profile - but 25 Mbit 4:2:0 or 4:1:1 DV video will take a quality hit from any recompression (straight cuts do not require recompression, so you can pass unaltered DV into and out of a computer forever, really). I really don't believe Vegas' DV codec, or any other, can mitigate this issue to any real degree. If I take a DV clip and move it four pixels to the left, then recompress, output, recapture and repeat, it will look seriously damaged within 3 generations regardless of the codec. Sycophant 01:44, 29 August 2006 (UTC)

I would really like to see some citations of the following claims. I am pretty familiar with the nature of DV compression, and entirely unconvinced that one codec will have significantly better performance than another - they are all compressing to the same standard - Ultimately it depends on the DV Codec being used. Sony's Vegas DV codec, for instance, has been touted as able to endure up to 50 generations of recompression (in the same codec) until compression artifacts become noticeable to the average human eye.

DVCPRO HD has been shown to endure five generations of recompression before perceptual changes occur; though, even 15 generations of recompression only displays modest compression-related loss under 3x magnification.

Also, the term 'generation' is basically meaningless in DV - a clip should be able to be moved back and forth infinitely from a tape to an edit application. As long as the image is not manipulated in any way, the data is unchanged (or should be; some apps do recompress at ingest for some reason). I have personally tested this to 15 'generations'. However, as soon as the image is changed (try a four-pixel horizontal shift, or a 105% resize), it will undergo serious alteration. --Sycophant 21:25, 2 October 2006 (UTC)

I think you are probably right, but DV25 and DV50 are different, and HD formats are better quality after the same amount of manipulation (i.e. not cuts). Some codecs are not nilpotent, i.e. recompressing the same data twice gives a different result - MPEG2 is an example, I think, because of the motion compensation. I haven't tested whether the quantisation in DV always gives the same result after a "generation loss", but it is conceivable that the quantisation differs each time and ultimately ends up with mush. Also, if DV is a format (like MPEG), different codecs might give different qualities and still play back on the same player. Stephen B Streater 21:51, 2 October 2006 (UTC)
DV25 and DV50 are different indeed, and I will leave DVCPRO HD alone as it's a whole different thing again. However, focusing on DV25, as it is by far the most common... Most DV software should be nilpotent (nice word). The quantisation is defined as part of the DV standard (IEC 61834, it seems) and should be constant. Most editing software (all the main apps now, for sure) will handle cuts-only DV without need for transcoding. Generationally there should be no change at all, bitwise, to DV25. This is certainly the case with Avid Xpress Pro (which I have tested to 10 generations with no change at all). I am not sure about the variance of DV codecs - they all follow the same DCT pattern and have the same audio and video data rates. I do not know what differences exist, but I am fairly confident (at least for DV25) that the limitations of the DCT quantisation cannot be overcome - especially once DCT blocks become misaligned, but also with colour changes. Sycophant 09:07, 11 October 2006 (UTC)
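A minimal sketch of the kind of 'generation' test described above, hedged: it uses Pillow's JPEG encoder as a stand-in for a DV-style intra-frame DCT codec (an assumption, since DV itself is not exercised here), with "gen0.jpg" as a placeholder input. Whether the bytes stop changing after the first pass is exactly the nilpotence question being discussed:

```python
# Illustrative only: JPEG stands in for a DV-style DCT codec, and "gen0.jpg"
# is a placeholder compressed frame. Each pass decodes and re-encodes with the
# same settings and no other edits, then checks whether the bytes changed.
import hashlib
from PIL import Image

def digest(path):
    """SHA-256 of the encoded file, for a bitwise comparison between generations."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

prev = "gen0.jpg"
for generation in range(1, 11):
    cur = f"gen{generation}.jpg"
    Image.open(prev).save(cur, quality=75)   # decode + re-encode, no other edits
    print(f"generation {generation}: bit-identical to previous = {digest(prev) == digest(cur)}")
    prev = cur
```

If the encoder is stable in the sense meant above, the digests stop changing after the first pass; a codec that is not would keep producing different bytes, and slowly accumulating error, on every pass.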

I have removed the parts of the Quality section that referred to the quality of the Sony Vegas DV codec and DVCPRO HD, as I can find no supporting evidence for the Vegas claim, which generally doesn't seem to be supported by the nature of 25 Mbit DV compression. DVCPRO HD is an entirely different format, and its performance doesn't really seem relevant to the section. --Sycophant 08:02, 18 March 2007 (UTC)

NLE now available at low cost

I've added free web applications - less work than a download, cross platform, and no OS purchase necessary. An example is my own company's Clesh, though with the growing number of web applications, others are sure to be available / arrive soon. Stephen B Streater 06:57, 13 April 2006 (UTC)

Amiga info missing

There should be mention of the Amiga video editing systems, which provided low-end semi-professional video editing and effects capabilities in the early 90s. I'm no expert, so I don't think I can add this in myself ... but I know that Amiga dominated the low-end niche now occupied by Windows desktop video suites.

Amiga-based video editing came out in late 1990 at $1,495.00, so much less than other NLE systems (at $100,000+) that average people could afford it. It also came with a 3D package known as LightWave and was used extensively on Babylon 5 and SeaQuest, and it doubled as a live switcher used by local TV stations everywhere. It also came with a video paint program.

It was not limited in hard drive size as the Apple was, but limited only in the number of drives you could hook to a bus (at the time, 7 per bus), and it had 2 buses for video and 1 for audio. An 18-drive system (up to 6 drives per SCSI bus, with the controller taking the seventh position in the SCSI chain) was possible if not for the price of drives (9 GB for $1,000 if I remember right).

Contradiction

Non-linear editing for film and television postproduction is a modern editing method that involves being able to access any frame in a video clip with the same ease as any other. This method was inherent in the cut and glue world of film editing from the beginning but, in the world of analogue film, editing was a destructive process. It was also linear.

The last sentence contradicts what came before. Was analog cut-and-glue editing non-linear or linear? AxelBoldt 21:27, 10 September 2006 (UTC)

IMO analog cut-and-glue editing is linear.--H.T. Chien / 眼鏡虎 (Discuss|Contributions) 05:48, 11 September 2006 (UTC)

Traditional film editing (literally cutting and taping) is a non-linear process really. You can cut, at any point, insert footage and have the following footage ripple downward. This is not possible in linear video editing, where once footage is on the edit master it can not be moved. It can be overwritten but never moved. With film cut editing, you can very literally cut a piece of film out of one place and move it to somewhere else and splice it in. This is non-linear. --Sycophant 06:48, 2 October 2006 (UTC)

"A-Team"?

The reference currently cited for "A-Team" no longer exists. Is there any other support for this claim? If not, it should be deleted. Rocinante9x 15:02, 13 March 2007 (UTC)

No. Per WP:REF#What to do when a reference link "goes dead": it should be repaired or replaced if possible, but not deleted. Cate | Talk 15:25, 13 March 2007 (UTC)
Incorrect. What it actually says is, "If the source material does not exist offline, and if there is no archived version of the webpage, then the citation should be removed and the material it supports should be regarded as unsourced." 86.183.24.235 (talk) 17:50, 6 March 2011 (UTC)

TLDR

I think this article could do with splitting up into sections, either chronologically or by subject. It's a bit ironic that you can't skip to different sections, making it a bit linear, unlike the subject matter. hrf (talk) 22:16, 19 July 2009 (UTC)

Quality

The quality section is becoming somewhat out of date, with this launch in 2009 and articles like this one in Streamingmedia.com talking about it in February 2010. In particular: It is also trialing a new codec, called Osprey, which delivers loss free compression and is claimed to eliminate the generational loss caused by putting footage through conventional editing systems. Loss free compression gives the option of having no generation loss. Stephen B Streater (talk) 22:05, 12 March 2010 (UTC)

Non Linear

According to the article, "non-linear" seems to mean "software." Does it really have nothing to do with lines, or specifically the fact that most NLEs (excepting Microsoft's Movie Maker, for instance) can accept more than one track/line?

Linear editing is characterised by linear assembly of a program by sequentially recording each edit onto a master videotape. Making changes to already recorded portions of the program was to be strenuously avoided or worked around by not changing the overall length of the segment to be changed, lest the entire program have to be rerecorded (under computer control using an edit list).

Non-linear editing utilizes random-access media so that edited sequences may be previewed without any recording. This enables potentially greater artistic freedom of expression. — Preceding unsigned comment added by Jmtindel (talkcontribs) 04:12, 16 October 2016 (UTC)

The history is detailed, but there is very little on typical functionality, screen layout, or limitations. This article might be called "the history of NLEs" rather than the current title. --Timtak (talk) 17:41, 31 May 2010 (UTC)

It has nothing to do with "lines" at all. It is a reference to the fact that on older video tape one could not "skip" to any point in the middle or end; one had to travel through the tape until one got to that point. "Linear" basically means that one thing follows the other.
In non-linear editing, which is characterised by the use of software, the material is no longer stored on tape, but on hard disks, which means it's just binary data, so you can jump to any part of the footage without having to scrub to that point. This is basically a result of the way computers store, and access, data.
The "track" you mention can refer to one of two things: the physical areas on tape where data is stored, or the sections on a timeline (NLE) that "house" that various components of the video. You can have multiple video tracks, stereo audio is usually housed on two tracks (one for each side), etc. This is refered to as "multi-track" editing.41.133.62.97 (talk) 13:42, 27 October 2010 (UTC)

Advertisement

This sentence seems somewhat out of place...possibly a corporate ad?

An example of this is NewTek's software application SpeedEdit. Billed as the world's fastest video editing program, this application is an example of the continual streamlining and refinement of video editing software interfaces. —Preceding unsigned comment added by Teeks99 (talkcontribs) 15:19, 29 April 2011 (UTC)

What non-linear editing is not.

The article misses the essence, which is not that non-linear editing gives you random access to the source material, but that it allows you to assemble the product in a non-linear, random fashion. With old-style linear (video) editing the product was assembled from beginning to end, in that order. One could replace/overwrite sections of video and/or audio, but never cut something out or insert extra material. Non-linear editing removes that restriction, allowing one to assemble the product in any arbitrary order. Fenke (talk) 12:21, 3 May 2011 (UTC)
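A toy sketch of that distinction (the clip names are invented placeholders): in the linear model the assembled master is a fixed sequence you can only overwrite in place, while in the non-linear model inserting or cutting a clip anywhere is a cheap list operation.

```python
# Toy illustration only; clip names are placeholders.

# Linear (tape-style) assembly: the master is a fixed sequence, so a slot can
# be overwritten in place, but inserting or deleting would force everything
# after it to be re-recorded.
master = ["intro", "scene_1", "scene_2", "credits"]
master[1] = "scene_1_take_2"        # overwrite in place: the only cheap edit

# Non-linear assembly: the programme is an ordered list of references, so
# material can be inserted or cut out anywhere, in any order of working.
cut = ["intro", "scene_1", "scene_2", "credits"]
cut.insert(2, "new_scene")          # ripple insert: later clips shift down
cut.remove("scene_1")               # cut a clip out entirely
print(cut)                          # ['intro', 'new_scene', 'scene_2', 'credits']
```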

New history comments

Descriptions of each of the common editing paradigms (Moviola, Kem flatbed, CMX video) and the various non-linear approaches (Montage's specialized hardware UI, EditDroid's flatbed on a Sun workstation, Avid's evolution from filmstrips on an Apollo workstation) would provide useful background and context. — Preceding unsigned comment added by Jmtindel (talkcontribs) 04:33, 16 October 2016 (UTC)

@Jmtindel: These are valid points. Wikipedia is written by knowledgeable people like you. We need your help. Please help fill in the blanks and expand the world's knowledge of the subject. Write it yourself. The one thing you need to do is find some sources, like other websites or books, that verify what you have to say. Feel free to contact me or to work with me to help input your knowledge into this and other related articles. Trackinfo (talk) 17:43, 16 October 2016 (UTC)

photo

How is a photo like Videowisconsinstudio.tif a good photo for distinguishing linear (Linear_suite.jpg) from non-linear editing systems? Or is that not the point of the photo placed at the very top of such an article? And who can assure us that this is not a linear editing studio? (I see consoles with monitors, which suggests that a linear editor is used.) --176.92.87.25 (talk) 08:02, 24 June 2017 (UTC)

If we wanted to show linear editing we'd show something like File:Showchron Film Editor.jpg. ~Kvng (talk) 13:05, 27 June 2017 (UTC)

Nyc Makhanyi325 (talk) 20:12, 4 November 2019 (UTC)

Proposed Merge

It seems like the current page for video editing software relies heavily on information that is provided in the NLE system article. Even though the former article could be improved and possibly differentiated, I think it would be better to merge the two. Wikierudite (talk) 06:11, 7 August 2019 (UTC)

Are you proposing to merge Video editing software into Non-linear editing system? Why do you think a merge would make things better? ~Kvng (talk) 14:59, 10 August 2019 (UTC)